Services or capabilities described in Amazon Web Services documentation might vary by Region. To see the differences applicable to the China Regions, see Getting Started with Amazon Web Services in China (PDF).

Amazon Managed Service for Apache Flink was previously known as Amazon Kinesis Data Analytics for Apache Flink.

Getting started with Amazon Managed Service for Apache Flink (DataStream API)

This section introduces you to the fundamental concepts of Managed Service for Apache Flink and walks you through implementing an application in Java using the DataStream API. It describes the available options for creating and testing your applications, and provides instructions for installing the tools required to complete the tutorials in this guide and to create your first application.

Review the components of the Managed Service for Apache Flink application


Amazon Managed Service for Apache Flink supports all Apache Flink APIs and potentially all JVM languages. For more information, see Flink's APIs.

Depending on the API you choose, the structure of the application and its implementation are slightly different. This Getting Started tutorial covers implementing an application using the DataStream API in Java.

To process data, your Managed Service for Apache Flink application runs a Java application that processes input and produces output using the Apache Flink runtime.

A typical Managed Service for Apache Flink application has the following components:

  • Runtime properties: You can use runtime properties to pass configuration parameters to your application, so that you can change its behavior without modifying and republishing the code.

  • Source(s): The application consumes data from one or more sources. A source uses a connector to read data from an external system, such as a Kinesis data stream or a Kafka topic. For more information, see Sources.

  • Operators: The application processes data by using one or more operators. An operator can transform, enrich, or aggregate data. For more information, see DataStream API operators.

  • Sink: The application sends data to external destinations through sinks. A sink uses a connector to send data to a Kinesis data stream, a Kafka topic, Amazon S3, or a relational database. You can also use a special connector to print the output for development purposes only. For more information, see Sinks.
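
The three components above can be sketched as a minimal DataStream API job. This sketch assumes the Flink DataStream dependencies (such as flink-streaming-java) are on the classpath; the class name, job name, and the use of fromElements in place of a real source connector are illustrative, not part of this guide.

```java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class BasicStreamingJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Source: a real application would use a Kinesis or Kafka connector here;
        // fromElements keeps the sketch self-contained.
        DataStream<String> input = env.fromElements("hello", "world");

        // Operator: a simple transformation applied to each record.
        DataStream<String> transformed =
                input.map((MapFunction<String, String>) String::toUpperCase);

        // Sink: print() is the development-only sink mentioned above.
        transformed.print();

        env.execute("basic-streaming-job");
    }
}
```

In a production application, you would replace fromElements and print() with source and sink connectors such as the Kinesis or Kafka connectors.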

Your application requires some external dependencies, such as the Flink connectors that your application uses, or potentially a Java library. To run in Amazon Managed Service for Apache Flink, the application must be packaged along with its dependencies in a fat JAR and uploaded to an Amazon S3 bucket. You then create a Managed Service for Apache Flink application, passing the location of the code package along with any other runtime configuration parameters.
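
Building the fat JAR with Maven is typically done with the Maven Shade plugin. The following pom.xml fragment is a minimal sketch, not the exact configuration used in this guide; the plugin version shown is an assumption, so pin the version your project needs.

```xml
<!-- Sketch of a maven-shade-plugin configuration that bundles the
     application and its dependencies into a single fat JAR at package time. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.4.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

Running mvn package then produces the fat JAR in the target directory, ready to upload to Amazon S3.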

This tutorial demonstrates how to use Apache Maven to package the application, and how to run the application locally in the IDE of your choice.

Fulfill the prerequisites for completing the exercises

To complete the steps in this guide, you must have the following:

  • Git client. Install the Git client, if you haven't already.

  • Java Development Kit (JDK) version 11. Install a JDK 11 and set the JAVA_HOME environment variable to point to your JDK installation location. If you don't have a JDK 11, you can use Amazon Corretto 11 or any other standard JDK of your choice.

    • To verify that you have the JDK installed correctly, run the following command. The output will differ if you are using a JDK other than Amazon Corretto. Make sure that the version is 11.x.

      $ java --version
      openjdk 11.0.23 2024-04-16 LTS
      OpenJDK Runtime Environment Corretto- (build 11.0.23+9-LTS)
      OpenJDK 64-Bit Server VM Corretto- (build 11.0.23+9-LTS, mixed mode)
  • Apache Maven. Install Apache Maven if you haven't already. To learn how to install it, see Installing Apache Maven.

    • To test your Apache Maven installation, enter the following:

    $ mvn -version
  • IDE for local development. We recommend that you use a development environment such as Eclipse Java Neon or IntelliJ IDEA to develop and compile your application.

To get started, go to Step 1: Set up an Amazon account and create an administrator user.