EventBridge Pipes target specifics
AWS Batch job queues
All AWS Batch submitJob parameters are configured explicitly with BatchParameters, and as with all pipe parameters, these can be dynamic using a JSON path to your incoming event payload.
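As a sketch, the structure below shows what the target parameters for a Batch job queue target might look like when built programmatically (field names follow the EventBridge Pipes CreatePipe API; the job definition name and JSON paths are hypothetical examples, not values from this page):

```python
# Sketch of TargetParameters for an AWS Batch job queue target.
# Field names are assumed from the Pipes CreatePipe API; the job
# definition and JSON paths are hypothetical.
def batch_target_parameters(job_definition: str) -> dict:
    return {
        "BatchJobParameters": {
            "JobDefinition": job_definition,
            # Dynamic values: resolved per event from the incoming
            # payload via a JSON path starting with "$."
            "JobName": "$.detail.jobName",
            "ContainerOverrides": {
                "Environment": [
                    {"Name": "INPUT_KEY", "Value": "$.detail.s3Key"},
                ],
            },
        }
    }

params = batch_target_parameters("my-job-definition")
```

Static values (the job definition) and dynamic JSON-path values (the job name) sit side by side in the same structure; the pipe substitutes the paths per event.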
CloudWatch Logs group
Whether or not you use an input transformer, the event payload is used as the log message. You can set the Timestamp (or the explicit LogStreamName of your destination) through CloudWatchLogsParameters in PipeTarget. As with all pipe parameters, these parameters can be dynamic when using a JSON path to your incoming event payload.
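A minimal sketch of those parameters, with both values supplied dynamically (field names follow the Pipes CreatePipe API; the JSON paths are hypothetical examples):

```python
# Sketch of TargetParameters for a CloudWatch Logs target. Both
# values are dynamic JSON paths into the incoming event payload;
# the paths themselves are hypothetical.
cloudwatch_logs_parameters = {
    "CloudWatchLogsParameters": {
        "LogStreamName": "$.detail.streamName",  # destination log stream
        "Timestamp": "$.detail.eventTime",       # timestamp of the log event
    }
}
```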
Amazon ECS task
All Amazon ECS runTask parameters are configured explicitly through EcsParameters. As with all pipe parameters, these parameters can be dynamic when using a JSON path to your incoming event payload.
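The sketch below shows one possible shape for the ECS target parameters, mixing static settings with a dynamic container-environment override (field names are assumed from the Pipes CreatePipe API; the ARN, container name, and JSON path are hypothetical):

```python
# Sketch of TargetParameters for an Amazon ECS runTask target.
# Field names are assumed from the Pipes CreatePipe API; the task
# definition ARN, container name, and JSON path are hypothetical.
ecs_task_parameters = {
    "EcsTaskParameters": {
        "TaskDefinitionArn": (
            "arn:aws:ecs:us-east-1:123456789012:task-definition/my-task:1"
        ),
        "TaskCount": 1,
        "LaunchType": "FARGATE",
        "Overrides": {
            "ContainerOverrides": [
                {
                    "Name": "app",
                    "Environment": [
                        # Dynamic value, resolved from each event
                        {"name": "ORDER_ID", "value": "$.detail.orderId"},
                    ],
                }
            ]
        },
    }
}
```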
Lambda functions and Step Functions workflows
Lambda and Step Functions do not have a batch API. To process batches of events from a pipe source, the batch is converted to a JSON array and passed as input to the Lambda or Step Functions target. For more information, see Amazon EventBridge Pipes batching and concurrency.
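In practice this means a Lambda target receives the whole batch as a list in its event argument. A minimal handler sketch (the record shape here is a simplified, hypothetical example):

```python
# Sketch of a Lambda handler receiving a pipe batch: the pipe converts
# the batch of source records into a JSON array, which arrives as the
# handler's `event` argument. The record shape is a simplified example.
def handler(event, context):
    results = []
    for record in event:  # one entry per source record in the batch
        results.append(record.get("body", "").upper())
    return results

# Simulated invocation with a two-record batch, roughly as a pipe
# would deliver it:
batch = [{"body": "first"}, {"body": "second"}]
output = handler(batch, None)
```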
Timestream for LiveAnalytics table
Considerations when specifying a Timestream for LiveAnalytics table as a pipe target include:
- Apache Kafka streams (including from Amazon MSK or third-party providers) are not currently supported as a pipe source.
- If you have specified a Kinesis or DynamoDB stream as the pipe source, you must specify the number of retry attempts. For more information, see Configuring the pipe settings.
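As a sketch, the retry setting for a Kinesis source lives in the pipe's source parameters (field names are assumed from the Pipes CreatePipe API; the values are hypothetical examples):

```python
# Sketch of SourceParameters for a Kinesis stream source feeding a
# Timestream for LiveAnalytics target. Field names are assumed from
# the Pipes CreatePipe API; the values are hypothetical.
kinesis_source_parameters = {
    "KinesisStreamParameters": {
        "StartingPosition": "LATEST",
        "MaximumRetryAttempts": 3,  # must be set for this target
        "BatchSize": 10,
    }
}
```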