
Use generative AI in SageMaker notebook environments

Jupyter AI is an open-source JupyterLab extension that integrates generative AI capabilities into Jupyter notebooks. Through the Jupyter AI chat interface and magic commands, users can experiment with code generated from natural language instructions, explain existing code, ask questions about their local files, generate entire notebooks, and more. The extension connects Jupyter notebooks with large language models (LLMs) that users can use to generate text, code, or images, and to ask questions about their own data. Jupyter AI supports generative model providers such as AI21, Anthropic, Amazon (JumpStart and Amazon Bedrock), Cohere, and OpenAI.
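
As an illustration of the magic commands, the following notebook cell loads the Jupyter AI magics and lists the model providers and models available in your environment (the syntax follows the upstream Jupyter AI project):

  %load_ext jupyter_ai_magics
  %ai list

In a separate cell, the %%ai cell magic takes a provider ID and model ID on its first line and treats the rest of the cell as the prompt. The model shown here is only a placeholder; use a model you have configured and can access:

  %%ai bedrock:anthropic.claude-v2
  Write a Python function that returns the first n prime numbers.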

The extension's package is included in Amazon SageMaker Distribution version 1.2 and later. Amazon SageMaker Distribution is a Docker environment for data science and scientific computing that is used as the default image of JupyterLab notebook instances. Users of other IPython environments can install Jupyter AI manually.
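
If the libraries are missing, a minimal installation from a notebook cell might look like the following. The package names are those published by the upstream Jupyter AI project; you may need to restart the Jupyter server after installing:

  # Install the full Jupyter AI extension (chat interface and magic commands)
  %pip install jupyter-ai

  # Or install only the magic commands
  %pip install jupyter-ai-magics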

In this section, we provide an overview of Jupyter AI capabilities and demonstrate how to configure models provided by JumpStart or Amazon Bedrock from JupyterLab or Studio Classic notebooks. For more in-depth information on the Jupyter AI project, refer to its documentation. Alternatively, you can refer to the blog post Generative AI in Jupyter for an overview and examples of key Jupyter AI capabilities.

Before using Jupyter AI and interacting with your LLMs, make sure that you satisfy the following prerequisites:

  • For models hosted by Amazon, you should have the ARN of your SageMaker endpoint or have access to Amazon Bedrock. For other model providers, you should have the API key used to authenticate and authorize requests to your model. Jupyter AI supports a wide range of model providers and language models; refer to the list of its supported models to stay updated on the latest available models. For information on how to deploy a model in JumpStart, see Deploy a Model in the JumpStart documentation. You need to request access to Amazon Bedrock to use it as your model provider.

  • Ensure that the Jupyter AI libraries are present in your environment. If they are not, install the required packages by following the instructions in Install Jupyter AI.

  • Familiarize yourself with the capabilities of Jupyter AI in Jupyter AI Features.

  • Configure the target models you wish to use by following the instructions in Configure your model provider; a brief configuration sketch follows this list.
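
As a rough sketch of what model configuration looks like once these prerequisites are satisfied, the cell below queries a JumpStart model deployed behind a SageMaker endpoint through the Jupyter AI magics. The endpoint name, Region, request schema, and response path are placeholders that depend on the input and output format of the specific model behind your endpoint, so replace them with the values for your deployment. Amazon Bedrock models are addressed in the same way with the bedrock provider prefix, as shown earlier:

  %%ai sagemaker-endpoint:my-jumpstart-endpoint --region-name=us-east-1 --request-schema={"inputs":"<prompt>"} --response-path=generated_text
  Summarize the main steps of a machine learning workflow.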

After completing the prerequisite steps, you can proceed to Use Jupyter AI in JupyterLab or Studio Classic.