Example Amazon Event Fork Pipelines use case
The following scenario describes an event-driven, serverless e-commerce application that uses Amazon Event Fork Pipelines. You can use this example e-commerce application as a reference for applying the pattern to your own workloads. The application takes orders from buyers through a RESTful API hosted by API Gateway and backed by the Amazon Lambda function CheckoutApiBackendFunction. This function publishes all received orders to an Amazon SNS topic named CheckoutEventsTopic which, in turn, fans out the orders to four different pipelines.
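As a rough illustration, the following Python sketch shows how a backend such as CheckoutApiBackendFunction might publish each order to the topic. The environment variable, the order schema, and the numeric amount message attribute are assumptions made for this example; the attribute is what enables the attribute-based filtering shown later in this section.

import json
import os

import boto3

sns = boto3.client("sns")

# Hypothetical sketch of CheckoutApiBackendFunction. The environment variable,
# order schema, and "amount" attribute are assumptions for this example.
def lambda_handler(event, context):
    order = json.loads(event["body"])  # API Gateway proxy integration payload
    sns.publish(
        TopicArn=os.environ["CHECKOUT_EVENTS_TOPIC_ARN"],
        Message=json.dumps(order),
        # Publishing the order total as a numeric message attribute lets
        # subscriptions filter orders by amount, as shown later in this section.
        MessageAttributes={
            "amount": {"DataType": "Number", "StringValue": str(order["amount"])}
        },
    )
    # 202 signals that the order was accepted for asynchronous processing.
    return {"statusCode": 202, "body": json.dumps({"status": "order received"})}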
The first pipeline is the regular checkout-processing pipeline designed and implemented by the owner of the e-commerce application. This pipeline has the Amazon SQS queue CheckoutQueue that buffers all received orders, an Amazon Lambda function named CheckoutFunction that polls the queue to process these orders, and the DynamoDB table CheckoutTable that securely saves all placed orders.
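A minimal sketch of the queue poller follows, assuming that Lambda's SQS event source mapping delivers batches from CheckoutQueue, that raw message delivery is enabled on the queue's subscription, and that each order carries an id attribute matching the table's partition key.

import json
from decimal import Decimal

import boto3

table = boto3.resource("dynamodb").Table("CheckoutTable")

# Hypothetical sketch of CheckoutFunction. Lambda's SQS event source mapping
# polls CheckoutQueue and invokes this handler with batches of messages.
def lambda_handler(event, context):
    for record in event["Records"]:
        # Assumes raw message delivery is enabled on the SNS subscription, so
        # the message body is the order itself rather than an SNS envelope.
        # parse_float=Decimal is needed because DynamoDB rejects Python floats.
        order = json.loads(record["body"], parse_float=Decimal)
        # Assumes the order carries an "id" attribute matching the table's
        # partition key; put_item overwrites on retry, which tolerates SQS
        # at-least-once delivery.
        table.put_item(Item=order)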
Applying Amazon Event Fork Pipelines
The components of the e-commerce application handle the core business logic. However, the e-commerce application owner also needs to address the following:
- Compliance: secure, compressed backups encrypted at rest, and sanitization of sensitive information
- Resiliency: replay of the most recent orders in case the fulfillment process is disrupted
- Searchability: running analytics and generating metrics on placed orders
Instead of implementing this event-processing logic, the application owner can subscribe Amazon Event Fork Pipelines to the CheckoutEventsTopic Amazon SNS topic:
- The event storage and backup pipeline is configured to transform data to remove credit card details, buffer it for 60 seconds, compress it using GZIP, and encrypt it using the default Amazon managed key for Amazon S3. This key is managed by Amazon Web Services and powered by the Amazon Key Management Service (Amazon KMS). A configuration sketch appears after this list.
  For more information, see Choose Amazon S3 For Your Destination, Amazon Data Firehose Data Transformation, and Configure Settings in the Amazon Data Firehose Developer Guide.
- The event search and analytics pipeline is configured with an index retry duration of 30 seconds, a bucket for storing orders that fail to be indexed in the search domain, and a filter policy to restrict the set of indexed orders.
  For more information, see Choose OpenSearch Service for Your Destination in the Amazon Data Firehose Developer Guide.
- The event replay pipeline is configured with the Amazon SQS queue that is part of the regular order-processing pipeline designed and implemented by the e-commerce application owner.
  For more information, see Queue Name and URL in the Amazon Simple Queue Service Developer Guide.
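To make the backup pipeline's settings concrete, here is a hypothetical boto3 sketch of an equivalent Amazon Data Firehose delivery stream. Every name and ARN is a placeholder; the deployed pipeline provisions comparable resources from its own template parameters rather than through calls like this one.

import boto3

firehose = boto3.client("firehose")

# Hypothetical sketch only: every name and ARN below is a placeholder. The
# deployed Event Fork Pipelines application creates equivalent resources.
firehose.create_delivery_stream(
    DeliveryStreamName="checkout-events-backup",
    DeliveryStreamType="DirectPut",
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-backup-role",
        "BucketARN": "arn:aws:s3:::checkout-events-backup",
        # Buffer incoming orders for 60 seconds (or 5 MiB) per S3 object.
        "BufferingHints": {"IntervalInSeconds": 60, "SizeInMBs": 5},
        # Compress each buffered batch with GZIP.
        "CompressionFormat": "GZIP",
        # Encrypt the delivered objects at rest with a KMS key.
        "EncryptionConfiguration": {
            "KMSEncryptionConfig": {
                "AWSKMSKeyARN": "arn:aws:kms:us-east-1:123456789012:key/EXAMPLE"
            }
        },
        # Run a transformation function that strips credit card details
        # from each record before delivery to Amazon S3.
        "ProcessingConfiguration": {
            "Enabled": True,
            "Processors": [
                {
                    "Type": "Lambda",
                    "Parameters": [
                        {
                            "ParameterName": "LambdaArn",
                            "ParameterValue": "arn:aws:lambda:us-east-1:123456789012:function:SanitizeOrder",
                        }
                    ],
                }
            ],
        },
    },
)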
The following JSON filter policy is set in the configuration for the event search and analytics pipeline. It matches only incoming orders in which the total amount is $100 or higher. For more information, see Amazon SNS message filtering.
{
"amount": [{ "numeric": [ ">=", 100 ] }]
}
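For illustration, the following sketch attaches that filter policy to a subscription on the topic using boto3. The topic and endpoint ARNs are placeholders; in practice, the deployed pipeline creates its own subscription and accepts the policy as a template parameter.

import json

import boto3

sns = boto3.client("sns")

# Hypothetical sketch: both ARNs are placeholders. The deployed pipeline
# normally creates this subscription itself, taking the policy as a
# template parameter.
sns.subscribe(
    TopicArn="arn:aws:sns:us-east-1:123456789012:CheckoutEventsTopic",
    Protocol="sqs",
    Endpoint="arn:aws:sqs:us-east-1:123456789012:search-pipeline-queue",
    Attributes={
        # Deliver only orders whose "amount" message attribute is >= 100.
        "FilterPolicy": json.dumps({"amount": [{"numeric": [">=", 100]}]})
    },
)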
Using the Amazon Event Fork Pipelines pattern, the e-commerce application owner can avoid the development overhead that typically comes with coding undifferentiated event-handling logic. Instead, she can deploy Amazon Event Fork Pipelines directly from the Amazon Serverless Application Repository into her Amazon Web Services account.
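As a sketch of that deployment path, the following boto3 calls stage and execute a CloudFormation change set for a pipeline application. The application ID, stack name, and parameter names are placeholders; each pipeline's listing in the repository documents its actual ARN and required parameters.

import boto3

sar = boto3.client("serverlessrepo")
cloudformation = boto3.client("cloudformation")

# Hypothetical sketch: the application ID, stack name, and parameters are
# placeholders, not the pipeline's real values.
change_set = sar.create_cloud_formation_change_set(
    ApplicationId=(
        "arn:aws:serverlessrepo:us-east-1:123456789012:"
        "applications/example-event-storage-backup-pipeline"
    ),
    StackName="checkout-backup-pipeline",
    ParameterOverrides=[
        {"Name": "TopicArn",
         "Value": "arn:aws:sns:us-east-1:123456789012:CheckoutEventsTopic"},
    ],
    # Pipelines typically create IAM roles and resource policies.
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_RESOURCE_POLICY"],
)
# Executing the change set creates the pipeline's resources in the account.
cloudformation.execute_change_set(ChangeSetName=change_set["ChangeSetId"])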