
Step 5: Produce and consume data

In this step of Get Started Using Amazon MSK, you produce and consume data.

To produce and consume messages
  1. Run the following command to start a console producer.

    $KAFKA_ROOT/bin/kafka-console-producer.sh --broker-list $BOOTSTRAP_SERVER --producer.config $KAFKA_ROOT/config/client.properties --topic MSKTutorialTopic
  2. Enter any message that you want, and press Enter. Repeat this step two or three times. Every time you enter a line and press Enter, that line is sent to your Apache Kafka cluster as a separate message.

  3. Keep the connection to the client machine open, and then open a second, separate connection to that machine in a new window. Because this is a new session, set the KAFKA_ROOT and BOOTSTRAP_SERVER environment variables again. For information about how to set these environment variables, see Creating a topic on the client machine.

  4. In your second connection to the client machine, run the following command to start a console consumer. (A programmatic alternative to both console tools is sketched after this procedure.)

    $KAFKA_ROOT/bin/kafka-console-consumer.sh --bootstrap-server $BOOTSTRAP_SERVER --consumer.config $KAFKA_ROOT/config/client.properties --topic MSKTutorialTopic --from-beginning

    You should start seeing the messages that you entered earlier with the console producer.

  5. Enter more messages in the producer window, and watch them appear in the consumer window.
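
The console producer and consumer are the quickest way to verify the cluster, but you can exchange the same messages programmatically with the Apache Kafka Java client. The following is a minimal sketch rather than part of this tutorial: it assumes the kafka-clients library (and any authentication library referenced by your client.properties, such as an IAM callback handler) is on the classpath, that the KAFKA_ROOT and BOOTSTRAP_SERVER environment variables are set as in the earlier steps, and that the topic is MSKTutorialTopic. The class name MSKTutorialClient and the consumer group ID msk-tutorial-group are illustrative.

    // Minimal sketch of a programmatic producer and consumer; see the notes above.
    import java.io.FileInputStream;
    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class MSKTutorialClient {

        // Load the same security settings that the console tools use, then add the
        // bootstrap broker string from the BOOTSTRAP_SERVER environment variable.
        static Properties baseProperties() throws Exception {
            Properties props = new Properties();
            try (FileInputStream in = new FileInputStream(
                    System.getenv("KAFKA_ROOT") + "/config/client.properties")) {
                props.load(in);
            }
            props.put("bootstrap.servers", System.getenv("BOOTSTRAP_SERVER"));
            return props;
        }

        public static void main(String[] args) throws Exception {
            String topic = "MSKTutorialTopic";

            // Produce three messages, like typing three lines into the console producer.
            Properties producerProps = baseProperties();
            producerProps.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            producerProps.put("value.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                for (int i = 1; i <= 3; i++) {
                    producer.send(new ProducerRecord<>(topic, "message " + i)).get();
                }
            }

            // Consume from the beginning of the topic, like --from-beginning above.
            Properties consumerProps = baseProperties();
            consumerProps.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            consumerProps.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            consumerProps.put("group.id", "msk-tutorial-group");
            consumerProps.put("auto.offset.reset", "earliest");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
                consumer.subscribe(Collections.singletonList(topic));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.value());
                }
            }
        }
    }

Because the consumer uses a new group ID with auto.offset.reset set to earliest, it reads the topic from the start, which is what the --from-beginning flag does for the console consumer.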

Next Step

Step 6: Use Amazon CloudWatch to view Amazon MSK metrics