Viewing Airflow logs in Amazon CloudWatch
Amazon MWAA can send Apache Airflow logs to Amazon CloudWatch. You can view logs for multiple environments from a single location to easily identify Apache Airflow task delays or workflow errors without the need for additional third-party tools. You must enable Apache Airflow logging on the Amazon Managed Workflows for Apache Airflow console to view DAG processing, scheduler, task, web server, and worker logs in CloudWatch.
Pricing
- Standard CloudWatch Logs charges apply. For more information, see CloudWatch pricing.
Before you begin
- You must have a role that can view logs in CloudWatch. For more information, see Accessing an Amazon MWAA environment.
Log types
Amazon MWAA creates a log group for each Airflow logging option you enable, and pushes the logs to the CloudWatch Logs groups associated with an environment. Log groups are named in the following format: YourEnvironmentName-LogType. For example, if your environment is named Airflow-v202-Public, Apache Airflow task logs are sent to Airflow-v202-Public-Task.

Log type | Description |
---|---|
YourEnvironmentName-DAGProcessing | The logs of the DAG processor manager (the part of the scheduler that processes DAG files). |
YourEnvironmentName-Scheduler | The logs the Airflow scheduler generates. |
YourEnvironmentName-Task | The task logs a DAG generates. |
YourEnvironmentName-WebServer | The logs the Airflow web interface generates. |
YourEnvironmentName-Worker | The logs generated as part of workflow and DAG execution. |
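If you want to confirm which of these log groups exist for an environment without opening the CloudWatch console, the following is a minimal sketch using boto3; the Airflow-v202-Public prefix is the example environment name from above and is only illustrative.

```python
import boto3

# Minimal sketch: list the CloudWatch Logs groups whose names begin with the
# example environment name used above (Airflow-v202-Public is illustrative).
logs = boto3.client("logs")

paginator = logs.get_paginator("describe_log_groups")
for page in paginator.paginate(logGroupNamePrefix="Airflow-v202-Public"):
    for group in page["logGroups"]:
        print(group["logGroupName"])
```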
Enabling Apache Airflow logs
You can enable Apache Airflow logs at the INFO, WARNING, ERROR, or CRITICAL level. When you choose a log level, Amazon MWAA sends logs for that level and all higher levels of severity. For example, if you enable logs at the INFO level, Amazon MWAA sends INFO, WARNING, ERROR, and CRITICAL logs to CloudWatch Logs.
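To see how the level you choose affects what reaches CloudWatch, consider a task that writes messages through Python's standard logging module. The sketch below is illustrative (the DAG id and task id are hypothetical); with the INFO level enabled, the info, warning, and error messages all appear in the task log group, while a debug message would be filtered out.

```python
import logging
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

logger = logging.getLogger(__name__)

def log_at_several_levels():
    # With logs enabled at the INFO level, all three messages below are sent
    # to CloudWatch; a logger.debug(...) call would not be.
    logger.info("Task started")
    logger.warning("Something looks unusual")
    logger.error("Something went wrong")

# Hypothetical DAG used only to illustrate where task log messages end up.
with DAG(
    dag_id="logging_level_demo",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PythonOperator(
        task_id="log_levels",
        python_callable=log_at_several_levels,
    )
```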
- Open the Environments page on the Amazon MWAA console.
- Choose an environment.
- Choose Edit.
- Choose Next.
- Choose one or more of the following logging options:
  - Choose the Airflow scheduler log group on the Monitoring pane.
  - Choose the Airflow web server log group on the Monitoring pane.
  - Choose the Airflow worker log group on the Monitoring pane.
  - Choose the Airflow DAG processing log group on the Monitoring pane.
  - Choose the Airflow task log group on the Monitoring pane.
  - Choose the logging level in Log level.
- Choose Next.
- Choose Save.
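The same logging options can also be enabled programmatically. The following is a minimal sketch using the boto3 MWAA client; the environment name MyAirflowEnvironment is a placeholder, and enabling all five log categories at the INFO level is only meant to show the shape of the request.

```python
import boto3

mwaa = boto3.client("mwaa")

# Minimal sketch: enable all five Apache Airflow log categories at the INFO
# level for an existing environment (the environment name is a placeholder).
mwaa.update_environment(
    Name="MyAirflowEnvironment",
    LoggingConfiguration={
        "DagProcessingLogs": {"Enabled": True, "LogLevel": "INFO"},
        "SchedulerLogs": {"Enabled": True, "LogLevel": "INFO"},
        "TaskLogs": {"Enabled": True, "LogLevel": "INFO"},
        "WebserverLogs": {"Enabled": True, "LogLevel": "INFO"},
        "WorkerLogs": {"Enabled": True, "LogLevel": "INFO"},
    },
)
```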
Viewing Apache Airflow logs
The following section describes how to view Apache Airflow logs in the CloudWatch console.
- Open the Environments page on the Amazon MWAA console.
- Choose an environment.
- Choose a log group in the Monitoring pane.
- Choose a log in Log stream.
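If you prefer to read a log stream without opening the CloudWatch console, the following is a minimal sketch using boto3; the log group name follows the Airflow-v202-Public-Task example from earlier and is illustrative.

```python
import boto3

logs = boto3.client("logs")
log_group = "Airflow-v202-Public-Task"  # illustrative name from the earlier example

# Find the most recently written log stream in the group.
streams = logs.describe_log_streams(
    logGroupName=log_group,
    orderBy="LastEventTime",
    descending=True,
    limit=1,
)["logStreams"]

if streams:
    # Print the newest events from that stream.
    events = logs.get_log_events(
        logGroupName=log_group,
        logStreamName=streams[0]["logStreamName"],
        startFromHead=False,
    )["events"]
    for event in events:
        print(event["message"])
```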
Example scheduler logs
You can view Apache Airflow logs for the Scheduler scheduling your workflows and parsing your dags
folder. The following steps describe how to open the log group for the Scheduler on the Amazon MWAA console, and view Apache Airflow logs on the CloudWatch Logs console.
To view logs for a requirements.txt
- Open the Environments page on the Amazon MWAA console.
- Choose an environment.
- Choose the Airflow scheduler log group on the Monitoring pane.
- Choose the requirements_install_ip log in Log streams.
- You should see the list of packages that were installed on the environment at /usr/local/airflow/.local/bin. For example:

  Collecting appdirs==1.4.4 (from -r /usr/local/airflow/.local/bin (line 1))
    Downloading https://files.pythonhosted.org/packages/3b/00/2344469e2084fb28kjdsfiuyweb47389789vxbmnbjhsdgf5463acd6cf5e3db69324/appdirs-1.4.4-py2.py3-none-any.whl
  Collecting astroid==2.4.2 (from -r /usr/local/airflow/.local/bin (line 2))
- Review the list of packages and check whether any of them encountered an error during installation. If something went wrong, you may see an error similar to the following:

  2021-03-05T14:34:42.731-07:00 No matching distribution found for LibraryName==1.0.0 (from -r /usr/local/airflow/.local/bin (line 4))
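To check for installation errors without scrolling through the stream, a minimal sketch with boto3 is shown below; the scheduler log group name is illustrative, and the filter phrase simply matches the error text shown above.

```python
import boto3

logs = boto3.client("logs")

# Minimal sketch: search the requirements_install_* stream(s) in the scheduler
# log group (the group name is illustrative) for the error text shown above.
response = logs.filter_log_events(
    logGroupName="Airflow-v202-Public-Scheduler",
    logStreamNamePrefix="requirements_install",
    filterPattern='"No matching distribution"',
)

for event in response["events"]:
    print(event["message"])
```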
What's next?
- Learn how to configure a CloudWatch alarm in Using Amazon CloudWatch alarms.
- Learn how to create a CloudWatch dashboard in Using CloudWatch dashboards.