OpenSearch Service flow framework templates
Amazon OpenSearch Service flow framework templates let you automate complex OpenSearch Service setup and preprocessing tasks by providing templates for common use cases. For example, you can use flow framework templates to automate machine learning (ML) setup tasks. A flow framework template provides a compact description of the setup process in a JSON or YAML document. These templates describe automated workflow configurations for conversational chat or query generation, AI connectors, tools, agents, and other components that prepare OpenSearch Service to work with generative models.
Amazon OpenSearch Service flow framework templates can be customized to meet your specific needs. For an example of a custom flow framework template, see flow-framework.
Creating ML connectors in OpenSearch Service
Amazon OpenSearch Service flow framework templates let you configure and install ML connectors by using the create connector API offered in ml-commons. You can use ML connectors to connect OpenSearch Service to other Amazon services or third-party platforms. For more information, see Creating connectors for third-party ML platforms.
Before you can create a connector in OpenSearch Service, you must do the following:
- Create an Amazon SageMaker domain.
- Create an IAM role.
- Configure pass role permission.
- Map the flow-framework and ml-commons roles in OpenSearch Dashboards.
For more information on how to set up ML connectors for Amazon services, see Amazon OpenSearch Service ML connectors for Amazon services.
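The pass role permission in the prerequisites above can be granted with an IAM policy along the following lines. This is a sketch: the role name, Region, account ID, and domain name are placeholders you would replace with your own values.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iam:PassRole",
      "Resource": "arn:aws:iam::account-id:role/opensearch-connector-role"
    },
    {
      "Effect": "Allow",
      "Action": "es:ESHttpPost",
      "Resource": "arn:aws:es:region:account-id:domain/domain-name/*"
    }
  ]
}
```

The first statement lets OpenSearch Service assume the connector role on your behalf; the second allows signed POST requests to the domain.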
Creating a connector through a flow-framework service
To create a flow-framework template with a connector, send a POST request to your OpenSearch Service domain endpoint. You can use cURL, a sample Python client, Postman, or another method to send a signed request. The POST request takes the following format:
POST /_plugins/_flow_framework/workflow
{
  "name": "Deploy Claude Model",
  "description": "Deploy a model using a connector to Claude",
  "use_case": "PROVISION",
  "version": {
    "template": "1.0.0",
    "compatibility": [
      "2.12.0",
      "3.0.0"
    ]
  },
  "workflows": {
    "provision": {
      "nodes": [
        {
          "id": "create_claude_connector",
          "type": "create_connector",
          "user_inputs": {
            "name": "Claude Instant Runtime Connector",
            "version": "1",
            "protocol": "aws_sigv4",
            "description": "The connector to BedRock service for Claude model",
            "actions": [
              {
                "headers": {
                  "x-amz-content-sha256": "required",
                  "content-type": "application/json"
                },
                "method": "POST",
                "request_body": "{ \"prompt\":\"${parameters.prompt}\", \"max_tokens_to_sample\":${parameters.max_tokens_to_sample}, \"temperature\":${parameters.temperature}, \"anthropic_version\":\"${parameters.anthropic_version}\" }",
                "action_type": "predict",
                "url": "https://bedrock-runtime.us-west-2.amazonaws.com/model/anthropic.claude-instant-v1/invoke"
              }
            ],
            "credential": {
              "roleArn": "arn:aws:iam::account-id:role/opensearch-secretmanager-role"
            },
            "parameters": {
              "endpoint": "bedrock-runtime.us-west-2.amazonaws.com",
              "content_type": "application/json",
              "auth": "Sig_V4",
              "max_tokens_to_sample": "8000",
              "service_name": "bedrock",
              "temperature": "0.0001",
              "response_filter": "$.completion",
              "region": "us-west-2",
              "anthropic_version": "bedrock-2023-05-31"
            }
          }
        }
      ]
    }
  }
}
If your domain resides within a virtual private cloud (Amazon VPC), you must be
connected to the Amazon VPC for the request to successfully create the AI connector.
Accessing an Amazon VPC varies by network configuration, but usually involves
connecting to a VPN or corporate network. To check that you can reach your OpenSearch Service
domain, navigate to https://your-vpc-domain.region.es.amazonaws.com
in a web browser and verify that you receive the default JSON response.
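As an alternative to a browser check, a small script can probe the endpoint and look for the default JSON response. This is a minimal sketch: the endpoint is a placeholder, and the HTTP call is passed in as a parameter so the helper can be exercised without network access (pass `requests.get` to probe the real domain).

```python
import json

# Placeholder VPC domain endpoint -- replace with your own.
HOST = "https://your-vpc-domain.region.es.amazonaws.com"

def is_reachable(get):
    """Return True when the domain answers with the default JSON response.

    `get` is any callable with the shape of requests.get, so the check can
    run against the real domain (pass requests.get) or a stub in tests.
    """
    try:
        resp = get(HOST, timeout=5)
        body = json.loads(resp.text)
        # The default response is a JSON document that includes a
        # "version" block describing the OpenSearch distribution.
        return resp.status_code == 200 and "version" in body
    except Exception:
        # Connection errors or non-JSON bodies mean the domain is not
        # reachable from the current network.
        return False
```

If `is_reachable(requests.get)` returns False, verify your VPN or corporate network connection before sending the workflow request.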
Sample Python client
The Python client is simpler to automate than an HTTP request and is more reusable. To create the AI connector with the Python client, save the following sample code to a Python file. The client requires the AWS SDK for Python (Boto3), along with the Requests and requests-aws4auth packages.
import boto3
import requests
from requests_aws4auth import AWS4Auth

host = 'domain-endpoint/'
region = 'region'
service = 'es'
credentials = boto3.Session().get_credentials()
awsauth = AWS4Auth(credentials.access_key, credentials.secret_key, region, service,
                   session_token=credentials.token)

path = '_plugins/_flow_framework/workflow'
url = host + path

payload = {
    "name": "Deploy Claude Model",
    "description": "Deploy a model using a connector to Claude",
    "use_case": "PROVISION",
    "version": {
        "template": "1.0.0",
        "compatibility": [
            "2.12.0",
            "3.0.0"
        ]
    },
    "workflows": {
        "provision": {
            "nodes": [
                {
                    "id": "create_claude_connector",
                    "type": "create_connector",
                    "user_inputs": {
                        "name": "Claude Instant Runtime Connector",
                        "version": "1",
                        "protocol": "aws_sigv4",
                        "description": "The connector to BedRock service for Claude model",
                        "actions": [
                            {
                                "headers": {
                                    "x-amz-content-sha256": "required",
                                    "content-type": "application/json"
                                },
                                "method": "POST",
                                "request_body": "{ \"prompt\":\"${parameters.prompt}\", \"max_tokens_to_sample\":${parameters.max_tokens_to_sample}, \"temperature\":${parameters.temperature}, \"anthropic_version\":\"${parameters.anthropic_version}\" }",
                                "action_type": "predict",
                                "url": "https://bedrock-runtime.us-west-2.amazonaws.com/model/anthropic.claude-instant-v1/invoke"
                            }
                        ],
                        "credential": {
                            "roleArn": "arn:aws:iam::account-id:role/opensearch-secretmanager-role"
                        },
                        "parameters": {
                            "endpoint": "bedrock-runtime.us-west-2.amazonaws.com",
                            "content_type": "application/json",
                            "auth": "Sig_V4",
                            "max_tokens_to_sample": "8000",
                            "service_name": "bedrock",
                            "temperature": "0.0001",
                            "response_filter": "$.completion",
                            "region": "us-west-2",
                            "anthropic_version": "bedrock-2023-05-31"
                        }
                    }
                }
            ]
        }
    }
}

headers = {"Content-Type": "application/json"}

r = requests.post(url, auth=awsauth, json=payload, headers=headers)
print(r.status_code)
print(r.text)
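On success, the create call returns a JSON body containing a workflow_id. In the open source flow framework API, a separate provision request for that workflow ID follows the create call; the helper below only builds the follow-up URL from the create response. This is a sketch, using the same placeholder host as the sample client above.

```python
import json

# Same placeholder endpoint as in the sample client above.
host = 'domain-endpoint/'

def provision_url(create_response_text):
    """Build the URL for the follow-up provision request from the JSON
    body returned by the workflow create call."""
    workflow_id = json.loads(create_response_text)["workflow_id"]
    return host + '_plugins/_flow_framework/workflow/' + workflow_id + '/_provision'
```

You would send a signed POST to this URL with the same `awsauth` object used for the create request, for example `requests.post(provision_url(r.text), auth=awsauth)`.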
Pre-defined workflow templates
Amazon OpenSearch Service provides several workflow templates for some common machine learning (ML) use cases. Using a template simplifies complex setups and provides many default values for use cases like semantic or conversational search. You can specify a workflow template when you call the Create Workflow API.
- To use a workflow template provided by OpenSearch Service, specify the template use case as the use_case query parameter.
- To use a custom workflow template, provide the complete template in the request body. For an example of a custom template, see an example JSON template or an example YAML template.
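The first option can be sketched as a small helper that places the template use case in the query string; the template's required parameters (for example, an API key) go in the request body. The host is a placeholder, and the template name in the usage note is illustrative.

```python
from urllib.parse import urlencode

# Placeholder domain endpoint -- replace with your own.
host = 'domain-endpoint/'

def template_workflow_url(use_case):
    """Build the create-workflow URL for a service-provided template,
    passing the template name as the use_case query parameter."""
    path = '_plugins/_flow_framework/workflow'
    return host + path + '?' + urlencode({'use_case': use_case})
```

For example, `template_workflow_url('cohere_embedding_model_deploy')` targets a Cohere embedding deployment template (name assumed from the open source flow framework); the body of the POST would then carry that template's required parameters.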
Template use cases
This table provides an overview of the available templates, a description of each template, and the required parameters.

| Template use case | Description | Required parameters |
|---|---|---|
| | Creates and deploys an Amazon Bedrock embedding model (by default, …). | |
| | Creates and deploys an Amazon Bedrock multimodal embedding model (by default, …). | |
| | Creates and deploys a Cohere embedding model (by default, embed-english-v3.0). | |
| | Creates and deploys a Cohere chat model (by default, Cohere Command). | |
| | Creates and deploys an OpenAI embedding model (by default, text-embedding-ada-002). | |
| | Creates and deploys an OpenAI chat model (by default, gpt-3.5-turbo). | |
| | Configures semantic search and deploys a Cohere embedding model. You must provide the API key for the Cohere model. | |
| | Configures semantic search and deploys a Cohere embedding model. Adds a query_enricher search processor that sets a default model ID for neural queries. You must provide the API key for the Cohere model. | |
| | Deploys an Amazon Bedrock multimodal model and configures an ingestion pipeline with a text_image_embedding processor and a k-NN index for multimodal search. You must provide your Amazon credentials. | |
Note
For all templates that require a secret ARN, the default is to store the secret with a key name of "key" in AWS Secrets Manager.
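For example, a connector credential that pulls the key from Secrets Manager rather than embedding it inline would look roughly like the following fragment. The ARNs are placeholders, and "key" is the default key name mentioned in the note above.

```json
"credential": {
  "secretArn": "arn:aws:secretsmanager:region:account-id:secret:my-api-key-secret",
  "roleArn": "arn:aws:iam::account-id:role/opensearch-secretmanager-role"
}
```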
Default templates with pretrained models
Amazon OpenSearch Service offers two additional default workflow templates that are not available in open source OpenSearch.

| Template use case | Description |
|---|---|
| | Configures semantic search |
| | Configures hybrid search |
Configure permissions
If you create a new domain with version 2.13 or later, permissions are already in place. If you enable flow framework on a preexisting OpenSearch Service domain with version 2.11 or earlier that you then upgrade to version 2.13 or later, you must define the flow_framework_manager role. Non-admin users must be mapped to this role in order to manage flow framework indexes on domains that use fine-grained access control. To manually create the flow_framework_manager role, perform the following steps:
- In OpenSearch Dashboards, go to Security and choose Permissions.
- Choose Create action group and configure the following groups:

| Group name | Permissions |
|---|---|
| flow_framework_full_access | cluster:admin/opensearch/flow_framework/*, cluster_monitor |
| flow_framework_read_access | cluster:admin/opensearch/flow_framework/workflow/get, cluster:admin/opensearch/flow_framework/workflow/search, cluster:admin/opensearch/flow_framework/workflow_state/get, cluster:admin/opensearch/flow_framework/workflow_state/search |

- Choose Roles and Create role.
- Name the role flow_framework_manager.
- For Cluster permissions, select flow_framework_full_access and flow_framework_read_access.
- For Index, type *.
- For Index permissions, select indices:admin/aliases/get, indices:admin/mappings/get, and indices_monitor.
- Choose Create.
- After you create the role, map it to any user or backend role that will manage flow framework indexes.
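If you prefer not to click through Dashboards, a role with the same permissions can be defined through the Security plugin REST API. This sketch assumes fine-grained access control is enabled and you are signed in as an admin; it grants the permissions directly rather than through the two action groups created in the steps above.

```json
PUT _plugins/_security/api/roles/flow_framework_manager
{
  "cluster_permissions": [
    "cluster:admin/opensearch/flow_framework/*",
    "cluster_monitor"
  ],
  "index_permissions": [
    {
      "index_patterns": ["*"],
      "allowed_actions": [
        "indices:admin/aliases/get",
        "indices:admin/mappings/get",
        "indices_monitor"
      ]
    }
  ]
}
```

After creating the role this way, map users to it with the corresponding rolesmapping endpoint or in Dashboards, as in the final step above.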