Services or capabilities described in Amazon Web Services documentation might vary by Region. To see the differences applicable to the China Regions, see Getting Started with Amazon Web Services in China (PDF).

Amazon EventBridge Pipes targets

You can send data in your pipe to a specific target. You can configure the following targets when setting up a pipe in EventBridge:

  • API destination

  • API Gateway

  • Batch job queue

  • CloudWatch log group

  • ECS task

  • Event bus in the same account and Region

  • Firehose delivery stream

  • Inspector assessment template

  • Kinesis stream

  • Lambda function (SYNC or ASYNC)

  • Redshift cluster data API queries

  • SageMaker Pipeline

  • SNS topic

  • SQS queue

  • Step Functions state machine

    • Express workflows (SYNC or ASYNC)

    • Standard workflows (ASYNC)
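As a minimal sketch of how one of these targets is wired up when creating a pipe, the request below routes an SQS queue source to an SQS queue target. All ARNs, the pipe name, and the role are placeholder assumptions, not values from this page.

```python
# Hypothetical CreatePipe request: SQS queue source -> SQS queue target.
# Every ARN and name here is a placeholder.
pipe_request = {
    "Name": "demo-pipe",
    "RoleArn": "arn:aws:iam::123456789012:role/demo-pipe-role",
    "Source": "arn:aws:sqs:us-east-1:123456789012:source-queue",
    "Target": "arn:aws:sqs:us-east-1:123456789012:target-queue",
}

# With AWS credentials configured, the pipe could be created via:
#   import boto3
#   boto3.client("pipes").create_pipe(**pipe_request)
```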

Target parameters

Some target services don't send the event payload to the target; instead, they treat the event as a trigger for invoking a specific API. EventBridge uses the PipeTargetParameters to specify what information gets sent to that API.


EventBridge does not support all JSON Path syntax, and evaluates paths at runtime. Supported syntax includes:

  • dot notation (for example, $.detail)

  • dashes

  • underscores

  • alphanumeric characters

  • array indices

  • wildcards (*)
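To make the supported subset above concrete, here is an illustrative checker for it; this is a sketch of the rules as listed, not the service's actual validator.

```python
import re

# Accepts: dot notation with alphanumerics, dashes, and underscores;
# array indices like [0]; and wildcards (* as a segment or index).
SEGMENT = r"(\.(?:[A-Za-z0-9_\-]+|\*)|\[(?:\d+|\*)\])"
PATH_RE = re.compile(r"^\$" + SEGMENT + r"*$")

def is_supported_path(path: str) -> bool:
    """Return True if `path` uses only the syntax listed above."""
    return PATH_RE.fullmatch(path) is not None
```

For instance, `is_supported_path("$.detail.state")` and `is_supported_path("$.records[0].body")` both return True, while an embedded path such as `"SELECT * FROM $.detail.state"` does not.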

Dynamic path parameters

EventBridge Pipes target parameters support optional dynamic JSON path syntax. You can use this syntax to specify JSON paths instead of static values (for example $.detail.state). The entire value has to be a JSON path, not only part of it. For example, RedshiftParameters.Sql can be $.detail.state but it can't be "SELECT * FROM $.detail.state". These paths are replaced dynamically at runtime with data from the event payload itself at the specified path. Dynamic path parameters can't reference new or transformed values resulting from input transformation. The supported syntax for dynamic parameter JSON paths is the same as when transforming input. For more information, see Amazon EventBridge Pipes input transformation.

Dynamic syntax can be used on all string, non-enum fields of all EventBridge Pipes enrichment and target parameters, except the following:

For example, to set the PartitionKey of a pipe Kinesis target to a custom key from your source event, set the KinesisTargetParameter.PartitionKey to:

  • "$.data.someKey" for a Kinesis source

  • "$.body.someKey" for an Amazon SQS source

Then, if the event payload is a valid JSON string, such as {"someKey":"someValue"}, EventBridge extracts the value from the JSON path and uses it as the target parameter. In this example, EventBridge would set the Kinesis PartitionKey to "someValue".
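The resolution described above can be sketched as follows. This is a hypothetical illustration of how a whole-value JSON path is replaced with data from the event payload; the real resolution happens inside EventBridge Pipes.

```python
import json
from functools import reduce

def resolve_param(value: str, event: dict):
    """Resolve a target parameter value against an event payload.

    Only a value that is entirely a JSON path (starts with "$.") is
    dynamic; anything else is treated as a static value.
    """
    if not value.startswith("$."):
        return value
    keys = value[2:].split(".")
    return reduce(lambda obj, key: obj[key], keys, event)

# An SQS source event whose body is the JSON string {"someKey":"someValue"}.
event = {"body": json.loads('{"someKey":"someValue"}')}
partition_key = resolve_param("$.body.someKey", event)
# partition_key is "someValue", as in the example above
```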


To make API calls on the resources that you own, EventBridge Pipes needs the appropriate permissions. EventBridge Pipes uses the IAM role that you specify on the pipe for enrichment and target calls, using the IAM principal pipes.amazonaws.com.

EventBridge Pipes target specifics

Amazon Batch job queues

All Amazon Batch submitJob parameters are configured explicitly through BatchParameters. As with all pipe parameters, these parameters can be dynamic when using a JSON path to your incoming event payload.
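A sketch of what such target parameters might look like for a Batch job queue target follows; the parameter names are shown for illustration, and the job definition name and the `$.detail.jobName` path are placeholder assumptions.

```python
# Hypothetical Batch target parameters: one static value, one dynamic
# JSON path resolved from the incoming event payload at runtime.
batch_target_parameters = {
    "BatchJobParameters": {
        "JobDefinition": "demo-job-definition",  # static (placeholder)
        "JobName": "$.detail.jobName",           # dynamic, from the event
    }
}
```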

CloudWatch Logs group

Whether you use an input transformer or not, the event payload is used as the log message. You can set the Timestamp (or the explicit LogStreamName of your destination) through CloudWatchLogsParameters in PipeTarget. As with all pipe parameters, these parameters can be dynamic when using a JSON path to your incoming event payload.
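As a sketch, the CloudWatchLogsParameters described above might be configured like this; the stream name and the `$.time` path are placeholder assumptions.

```python
# Hypothetical CloudWatch Logs target parameters: a static log stream
# name plus a timestamp taken dynamically from the event payload.
cloudwatch_logs_target_parameters = {
    "CloudWatchLogsParameters": {
        "LogStreamName": "demo-stream",  # static destination (placeholder)
        "Timestamp": "$.time",           # dynamic, from the event
    }
}
```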

Amazon ECS task

All Amazon ECS runTask parameters are configured explicitly through EcsParameters. As with all pipe parameters, these parameters can be dynamic when using a JSON path to your incoming event payload.
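A minimal sketch of EcsParameters for a runTask target follows; the task definition ARN and the chosen launch type are placeholder assumptions.

```python
# Hypothetical ECS task target parameters; all values are placeholders.
ecs_target_parameters = {
    "EcsParameters": {
        "TaskDefinitionArn": (
            "arn:aws:ecs:us-east-1:123456789012:task-definition/demo-task:1"
        ),
        "TaskCount": 1,
        "LaunchType": "FARGATE",
    }
}
```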