
Extended Client Library for Python

Prerequisites

The following are the prerequisites for using the Amazon SNS Extended Client Library for Python:

  • An Amazon SDK.

    The example on this page uses the Amazon SDK for Python (Boto3). To install and set up the SDK, see the Amazon SDK for Python documentation.

  • An Amazon Web Services account with the proper credentials.

    To create an Amazon Web Services account, navigate to the Amazon home page, and then choose Create an Amazon Account. Follow the instructions.

    For information about credentials, see Credentials in the Amazon SDK for Python Developer Guide.

  • Python 3.x and pip.

  • The Amazon SNS Extended Client Library for Python (also available from PyPI).

Configuring message storage

The following attributes are available on Boto3 Amazon SNS Client, Topic, and PlatformEndpoint objects to configure the Amazon S3 message storage options. A brief configuration sketch follows the list.

  • large_payload_support – The name of the Amazon S3 bucket that stores large messages.

  • message_size_threshold – The message size threshold, in bytes, above which a message is stored in the large messages bucket. The value cannot be less than 0 or greater than 262144. The default is 262144 (256 KB).

  • always_through_s3 – If True, then all messages are stored in Amazon S3. The default is False.

  • s3 – The Boto3 Amazon S3 resource object used to store objects in Amazon S3. Use this if you want to control the Amazon S3 resource (for example, to use a custom Amazon S3 config or credentials). If not set before first use, the default is boto3.resource("s3").
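
For illustration, the following minimal sketch shows how these attributes might be set on an extended SNS client. The bucket name, threshold value, and custom retry configuration are placeholders, not values from this guide.

import boto3
import sns_extended_client  # Importing this module adds the attributes described above to Boto3 SNS objects
from botocore.config import Config

sns_client = boto3.client("sns", region_name="us-east-1")

# Placeholder bucket name for storing large message payloads
sns_client.large_payload_support = "my-extended-client-bucket"

# Offload only payloads larger than 64 KB (the default threshold is 262144 bytes)
sns_client.message_size_threshold = 65536

# Set to True to store every message payload in Amazon S3, regardless of size
sns_client.always_through_s3 = False

# Optionally supply your own S3 resource, for example with a custom retry configuration
sns_client.s3 = boto3.resource("s3", config=Config(retries={"max_attempts": 10}))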

Example: Publishing messages to Amazon SNS with the payload stored in Amazon S3

The following code example shows how to:

  • Create a sample Amazon SNS topic and Amazon SQS queue.

  • Subscribe the queue to receive messages from the topic.

  • Publish a test message. The message payload is stored in Amazon S3, and a reference to it is published.

  • Print the published message from the queue along with the original message retrieved from Amazon S3.

To publish a large message, use the Amazon SNS Extended Client Library for Python. The message you send references an Amazon S3 object containing the actual message content.

import boto3
import sns_extended_client  # Importing this module adds the extended-client attributes to Boto3 SNS objects
from json import loads

s3_extended_payload_bucket = "extended-client-bucket-store"  # S3 bucket that stores the large payloads
TOPIC_NAME = "---TOPIC-NAME---"
QUEUE_NAME = "---QUEUE-NAME---"

# Helper to fetch the original message payload from S3
def get_msg_from_s3(body):
    json_msg = loads(body)
    s3_client = boto3.client("s3")
    s3_object = s3_client.get_object(
        Bucket=json_msg[1].get("s3BucketName"), Key=json_msg[1].get("s3Key")
    )
    msg = s3_object.get("Body").read().decode()
    return msg

# Helper to fetch and print the message from the SQS queue and from S3
def fetch_and_print_from_sqs(sqs, queue_url):
    """Fetch a message from the SQS queue and print it along with the payload stored in S3."""
    message = sqs.receive_message(
        QueueUrl=queue_url, MessageAttributeNames=["All"], MaxNumberOfMessages=1
    ).get("Messages")[0]
    message_body = message.get("Body")
    print("Published Message: {}".format(message_body))
    print("Message Stored in S3 Bucket is: {}\n".format(get_msg_from_s3(message_body)))

# Initialize the SNS client and create the SNS topic
sns_extended_client = boto3.client("sns", region_name="us-east-1")
create_topic_response = sns_extended_client.create_topic(Name=TOPIC_NAME)
demo_topic_arn = create_topic_response.get("TopicArn")

# Create an SQS queue and subscribe it to the SNS topic
sqs = boto3.client("sqs")
demo_queue_url = sqs.create_queue(QueueName=QUEUE_NAME).get("QueueUrl")
demo_queue_arn = sqs.get_queue_attributes(
    QueueUrl=demo_queue_url, AttributeNames=["QueueArn"]
)["Attributes"].get("QueueArn")

# Set the RawMessageDelivery subscription attribute to TRUE
sns_extended_client.subscribe(
    TopicArn=demo_topic_arn,
    Protocol="sqs",
    Endpoint=demo_queue_arn,
    Attributes={"RawMessageDelivery": "true"},
)

sns_extended_client.large_payload_support = s3_extended_payload_bucket

# To store the content of all messages in S3, set always_through_s3 to True
# This example sets the message size threshold to 32 bytes; adjust the threshold for your use case
# A message is uploaded to S3 only when its payload size exceeds the threshold
sns_extended_client.message_size_threshold = 32

sns_extended_client.publish(
    TopicArn=demo_topic_arn,
    Message="This message should be published to S3 as it exceeds the message_size_threshold limit",
)

# Print the published message and the payload stored in S3
fetch_and_print_from_sqs(sqs, demo_queue_url)

Output

Published Message: [
  "software.amazon.payloadoffloading.PayloadS3Pointer",
  {
    "s3BucketName": "extended-client-bucket-store",
    "s3Key": "xxxx-xxxxx-xxxxx-xxxxxx"
  }
]

Message Stored in S3 Bucket is: This message should be published to S3 as it exceeds the message_size_threshold limit
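
As noted in the configuration section, the same attributes are also available on Boto3 Topic and PlatformEndpoint objects. The following minimal sketch shows the equivalent configuration on a Topic resource; the topic ARN and bucket name are hypothetical placeholders.

import boto3
import sns_extended_client  # Adds the extended-client attributes to Boto3 SNS objects

# Hypothetical topic ARN used for illustration only
topic = boto3.resource("sns").Topic("arn:aws:sns:us-east-1:123456789012:MyTopic")

topic.large_payload_support = "my-extended-client-bucket"  # placeholder bucket name
topic.always_through_s3 = True  # store every payload in S3, regardless of size

topic.publish(Message="This payload is stored in S3 because always_through_s3 is True")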