Step 2: Create a Kinesis Data Firehose Delivery Stream with Splunk as a Destination
In this part of the Kinesis Data Firehose tutorial, you create an Amazon Kinesis Data Firehose delivery stream to receive the log data from Amazon CloudWatch and deliver that data to Splunk.
The logs that CloudWatch sends to the delivery stream are in a compressed format. However, Kinesis Data Firehose can't send compressed logs to Splunk. Therefore, when you create the delivery stream in the following procedure, you enable data transformation and configure an Amazon Lambda function to uncompress the log data. Kinesis Data Firehose then sends the uncompressed data to Splunk.
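To give a sense of what that transformation involves, the following minimal Python sketch shows the general shape of such a Firehose transformation function: each incoming record is base64-decoded, gunzipped, and returned to Kinesis Data Firehose as uncompressed log events. It is illustrative only and is not the actual Kinesis Firehose Cloudwatch Logs Processor blueprint, which additionally handles record size limits and re-ingestion.

import base64
import gzip
import json

def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        # Kinesis Data Firehose passes each record's payload base64-encoded;
        # CloudWatch Logs compresses that payload with gzip.
        payload = gzip.decompress(base64.b64decode(record["data"]))
        message = json.loads(payload)

        if message.get("messageType") != "DATA_MESSAGE":
            # Drop control messages such as subscription confirmation events.
            output.append({"recordId": record["recordId"], "result": "Dropped"})
            continue

        # Join the individual log events into one uncompressed, newline-delimited payload.
        data = "\n".join(e["message"] for e in message["logEvents"]) + "\n"
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(data.encode("utf-8")).decode("utf-8"),
        })
    return {"records": output}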
To create a Kinesis Data Firehose delivery stream with Splunk as a destination
- Open the Kinesis Data Firehose console at https://console.amazonaws.cn/firehose/.
- Choose Create delivery stream.
- For the name of the delivery stream, enter VPCtoSplunkStream. Then scroll to the bottom, and choose Next.
- For Data transformation*, choose Enabled.
- For Lambda function*, choose Create new.
- In the Choose Lambda blueprint pane, scroll down and choose Kinesis Firehose Cloudwatch Logs Processor. This opens the Amazon Lambda console.
Note
We recommend that you set the Lambda buffering hint to the low value of 256 KB.
- On the Amazon Lambda console, for the function name, enter VPCtoSplunkLambda.
- In the description text under Execution role, choose the IAM console link to create a custom role. This opens the Amazon Identity and Access Management (IAM) console.
- In the IAM console, choose Lambda.
- Choose Next: Permissions.
- Choose Create policy.
- Choose the JSON tab and replace the existing JSON with the following. Be sure to replace the your-region and your-aws-account-id placeholders with your Amazon Region code and account ID. Don't include any hyphens or dashes in the account ID. For a list of Amazon Region codes, see Amazon Regions and Endpoints.

  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": [
          "firehose:PutRecordBatch"
        ],
        "Resource": [
          "arn:aws:firehose:your-region:your-aws-account-id:deliverystream/VPCtoSplunkStream"
        ]
      }
    ]
  }
This policy allows the Lambda function to put data back into the delivery stream by invoking the PutRecordBatch operation. This step is needed because a Lambda function can return no more than 6 MiB of data each time Kinesis Data Firehose invokes it. If the size of the uncompressed data exceeds 6 MiB, the function invokes PutRecordBatch to put some of the data back into the delivery stream for future processing. (An illustrative sketch of this re-ingestion call appears after this procedure.)
- Complete the remaining policy creation steps, enter VPCtoSplunkLambdaPolicy for the policy name, and then choose Create policy.
- Back in the Create role window, refresh the list of policies, and then choose VPCtoSplunkLambdaPolicy by selecting the box to its left.
- Choose Next: Tags.
- Choose Next: Review.
- For Role Name, enter VPCtoSplunkLambdaRole, and then choose Create role.
- Back in the Lambda console, refresh the list of existing roles, and then select VPCtoSplunkLambdaRole.
- Scroll down and choose Create function.
- In the Lambda function pane, scroll down to the Basic settings section, and increase the timeout to 3 minutes.
- Scroll up and choose Save.
- Back in the Choose Lambda blueprint dialog box, choose Close.
- On the delivery stream creation page, under the Transform source records with Amazon Lambda section, choose the refresh button. Then choose VPCtoSplunkLambda in the list of functions.
- Scroll down and choose Next.
- For Destination*, choose Splunk.
- For Splunk cluster endpoint, see the information at Configure Amazon Kinesis Firehose to send data to the Splunk platform in the Splunk documentation.
- Keep Splunk endpoint type set to Raw endpoint.
- Enter the value (and not the name) of your Splunk HTTP Event Collector (HEC) token.
- For S3 backup mode*, choose Backup all events.
- Choose an existing Amazon S3 bucket (or create a new one if you want), and then choose Next.
- On the Configure settings page, scroll down to the IAM role section, and choose Create new or choose.
- In the IAM role list, choose Create a new IAM role. For Role Name, enter VPCtoSplunkLambdaFirehoseRole, and then choose Allow.
- Choose Next, and review the configuration that you chose for the delivery stream. Then choose Create delivery stream.
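For reference, the following Python sketch shows the kind of PutRecordBatch call that the VPCtoSplunkLambdaPolicy permits. The blueprint already implements this re-ingestion logic; the reingest helper and its overflow_payloads parameter here are illustrative only.

import boto3

firehose = boto3.client("firehose")

def reingest(overflow_payloads, stream_name="VPCtoSplunkStream"):
    # Put uncompressed payloads (bytes) back into the delivery stream so that
    # they are transformed and delivered to Splunk in a later invocation.
    records = [{"Data": payload} for payload in overflow_payloads]
    response = firehose.put_record_batch(
        DeliveryStreamName=stream_name,
        Records=records,  # PutRecordBatch accepts at most 500 records per call
    )
    if response["FailedPutCount"] > 0:
        # Production code should retry the individual records that failed.
        raise RuntimeError("%d records failed to re-ingest" % response["FailedPutCount"])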
Proceed to Step 3: Send the Data from Amazon CloudWatch to Kinesis Data Firehose.