Log Amazon SageMaker Events with Amazon CloudWatch
To help you debug your compilation jobs, processing jobs, training jobs, endpoints, transform jobs, notebook instances, and notebook instance lifecycle configurations, anything an algorithm container, a model container, or a notebook instance lifecycle configuration sends to stdout or stderr is also sent to Amazon CloudWatch Logs. In addition to debugging, you can use these logs for progress analysis.
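Each of the resource types above writes to a fixed CloudWatch log group (endpoints get a per-endpoint group). As a quick reference, this mapping can be captured in a small helper; the dictionary and function names below are illustrative, not part of any SageMaker SDK:

```python
# Fixed CloudWatch log group names used by SageMaker resource types.
# The mapping keys and helper names are illustrative conventions.
SAGEMAKER_LOG_GROUPS = {
    "compilation": "/aws/sagemaker/CompilationJobs",
    "processing": "/aws/sagemaker/ProcessingJobs",
    "training": "/aws/sagemaker/TrainingJobs",
    "transform": "/aws/sagemaker/TransformJobs",
    "labeling": "/aws/sagemaker/LabelingJobs",
    "notebook": "/aws/sagemaker/NotebookInstances",
}

def log_group_for(resource_type: str) -> str:
    """Return the CloudWatch log group for a SageMaker resource type."""
    return SAGEMAKER_LOG_GROUPS[resource_type]

def endpoint_log_group(endpoint_name: str) -> str:
    """Endpoint logs live in a per-endpoint group named after the endpoint."""
    return f"/aws/sagemaker/Endpoints/{endpoint_name}"
```

The table that follows lists these groups together with the log stream names written inside each one.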
Logs
The following table lists all of the logs provided by Amazon SageMaker.
| Log Group Name | Log Stream Name |
| --- | --- |
| /aws/sagemaker/CompilationJobs | [compilation-job-name] |
| /aws/sagemaker/Endpoints/[EndpointName] | [production-variant-name]/[instance-id] |
| (For Asynchronous Inference endpoints) | [production-variant-name]/[instance-id]/data-log |
| (For Inference Pipelines) | [production-variant-name]/[instance-id]/[container-name provided in the SageMaker model] |
| /aws/sagemaker/groundtruth/WorkerActivity | aws/sagemaker/groundtruth/worker-activity/[requester-AWS-Id]-[region]/[timestamp] |
| /aws/sagemaker/InferenceRecommendationsJobs | [inference-recommendations-job-name]/execution |
| | [inference-recommendations-job-name]/CompilationJob/[compilation-job-name] |
| | [inference-recommendations-job-name]/Endpoint/[endpoint-name] |
| /aws/sagemaker/LabelingJobs | [labeling-job-name] |
| /aws/sagemaker/NotebookInstances | [notebook-instance-name]/[LifecycleConfigHook] |
| | [notebook-instance-name]/jupyter.log |
| /aws/sagemaker/ProcessingJobs | [processing-job-name]/[hostname]-[epoch_timestamp] |
| /aws/sagemaker/studio | [domain-id]/[user-profile-name]/[app-type]/[app-name] |
| /aws/sagemaker/TrainingJobs | [training-job-name]/algo-[instance-number-in-cluster]-[epoch_timestamp] |
| /aws/sagemaker/TransformJobs | [transform-job-name]/[instance-id]-[epoch_timestamp] |
| | [transform-job-name]/[instance-id]-[epoch_timestamp]/data-log |
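Because training-job log streams begin with the training job name, the job name works as a stream-name prefix filter when reading logs programmatically. A minimal sketch using boto3's CloudWatch Logs client (assumes AWS credentials are configured and the boto3 package is installed; the function name is illustrative):

```python
def training_log_events(job_name, region="us-east-1"):
    """Yield CloudWatch log events for all streams of one training job.

    Training streams are named [training-job-name]/algo-..., so the
    job name serves as a logStreamNamePrefix filter. Assumes valid
    AWS credentials for the target account and region.
    """
    import boto3  # lazy import so the module loads without boto3 installed

    logs = boto3.client("logs", region_name=region)
    paginator = logs.get_paginator("filter_log_events")
    for page in paginator.paginate(
        logGroupName="/aws/sagemaker/TrainingJobs",
        logStreamNamePrefix=job_name,
    ):
        yield from page.get("events", [])
```

Each yielded event is a dict that includes `logStreamName`, `timestamp`, and `message` keys, so the same pattern works for the other log groups in the table by swapping the group name and prefix.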
Note
1. The /aws/sagemaker/NotebookInstances/[LifecycleConfigHook] log stream is created when you create a notebook instance with a lifecycle configuration. For more information, see Customize a Notebook Instance Using a Lifecycle Configuration Script.
2. For Inference Pipelines, if you don't provide container names, the platform uses **container-1, container-2**, and so on, corresponding to the order provided in the SageMaker model.
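These default container names flow into the per-container log stream names. A small illustrative helper (assuming the [production-variant-name]/[instance-id]/[container-name] stream layout from the table above; the function name is hypothetical):

```python
def pipeline_stream_name(variant_name, instance_id, container_index):
    """Default log stream name for the Nth container (1-based) of an
    Inference Pipeline when no container names were provided in the
    SageMaker model. Sketch only; assumes the
    [production-variant-name]/[instance-id]/[container-name] layout.
    """
    return f"{variant_name}/{instance_id}/container-{container_index}"
```

For example, the second container of variant `AllTraffic` on instance `i-0abc123` would log to `AllTraffic/i-0abc123/container-2`.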
For more information about logging events with CloudWatch logging, see What is Amazon CloudWatch Logs? in the Amazon CloudWatch User Guide.