Perform Automatic Model Tuning with SageMaker
Amazon SageMaker automatic model tuning (AMT), also known as hyperparameter tuning, finds the best version of a model by running many training jobs on your dataset. To do this, AMT uses the algorithm and ranges of hyperparameters that you specify. It then chooses the hyperparameter values that create the best-performing model, as measured by a metric that you choose.
For example, suppose that you want to solve a binary classification problem on a marketing dataset. Your goal is to maximize the area under the curve (AUC) metric by training a model with the XGBoost algorithm. You want to find which values of the eta, alpha, min_child_weight, and max_depth hyperparameters train the best model. Specify a range of values for these hyperparameters. SageMaker hyperparameter tuning then searches within these ranges to find the combination of values that produces the model with the highest AUC. To conserve resources or meet a specific model quality expectation, you can also set completion criteria that stop the tuning job after the criteria have been met.
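As a concrete illustration, the following is a minimal sketch of such a tuning job using the SageMaker Python SDK. The IAM role ARN, S3 bucket paths, instance type, range boundaries, and job counts are placeholder assumptions that you would replace with your own values.

```python
import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput
from sagemaker.tuner import ContinuousParameter, IntegerParameter, HyperparameterTuner

session = sagemaker.Session()
role = "arn:aws:iam::111122223333:role/SageMakerExecutionRole"  # placeholder role ARN

# Built-in XGBoost container image for the current region.
xgboost_image = image_uris.retrieve("xgboost", session.boto_region_name, version="1.7-1")

estimator = Estimator(
    image_uri=xgboost_image,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://amzn-s3-demo-bucket/output",  # placeholder bucket
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="binary:logistic", num_round=100)

# Ranges for the four hyperparameters from the example above; the
# boundaries here are illustrative, not recommendations.
hyperparameter_ranges = {
    "eta": ContinuousParameter(0.1, 0.5),
    "alpha": ContinuousParameter(0, 2),
    "min_child_weight": ContinuousParameter(1, 10),
    "max_depth": IntegerParameter(1, 10),
}

tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation:auc",  # metric emitted by built-in XGBoost
    objective_type="Maximize",
    hyperparameter_ranges=hyperparameter_ranges,
    max_jobs=20,
    max_parallel_jobs=3,
)

# Start the tuning job; the channel paths are placeholders for your CSV data.
tuner.fit({
    "train": TrainingInput("s3://amzn-s3-demo-bucket/train", content_type="csv"),
    "validation": TrainingInput("s3://amzn-s3-demo-bucket/validation", content_type="csv"),
})
```

When the tuning job finishes, the training job with the highest validation:auc value identifies the best combination of hyperparameter values within the ranges you specified.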
You can use SageMaker AMT with built-in algorithms, custom algorithms, or SageMaker pre-built containers for machine learning frameworks.
SageMaker AMT can use Amazon EC2 Spot Instances to optimize costs when running training jobs. For more information, see Managed Spot Training in Amazon SageMaker.
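The following sketch shows one way to enable managed Spot training on an estimator with the SageMaker Python SDK; the image URI, role, bucket, and time limits are placeholder assumptions.

```python
from sagemaker.estimator import Estimator

estimator = Estimator(
    image_uri="<training-image-uri>",      # placeholder training image
    role="<execution-role-arn>",           # placeholder IAM role ARN
    instance_count=1,
    instance_type="ml.m5.xlarge",
    use_spot_instances=True,               # request Spot capacity for training
    max_run=3600,                          # max training time, in seconds
    max_wait=7200,                         # must be >= max_run; includes time spent waiting for Spot
    checkpoint_s3_uri="s3://amzn-s3-demo-bucket/checkpoints",  # resume after Spot interruptions
)
```

An estimator configured this way can be passed to a HyperparameterTuner, so that the training jobs launched by the tuning job run on Spot capacity.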
Before you start using hyperparameter tuning, you should have a well-defined machine learning problem, including the following:
- A dataset
- An understanding of the type of algorithm that you need to train
- A clear understanding of how you measure success
Prepare your dataset and algorithm so that they work in SageMaker and successfully run a training job at least once. For information about setting up and running a training job, see Get Started with Amazon SageMaker.
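For instance, reusing the estimator from the earlier sketch, a single smoke-test training run might look like the following; the S3 path is a placeholder for your prepared dataset.

```python
from sagemaker.inputs import TrainingInput

# Run one standalone training job to confirm that the dataset and
# algorithm work together before launching a tuning job.
estimator.fit(
    {"train": TrainingInput("s3://amzn-s3-demo-bucket/train", content_type="csv")}
)
```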
Topics
- How Hyperparameter Tuning Works
- Define metrics and environment variables
- Define Hyperparameter Ranges
- Track and set completion criteria for your tuning job
- Tune Multiple Algorithms with Hyperparameter Optimization to Find the Best Model
- Example: Hyperparameter Tuning Job
- Stop Training Jobs Early
- Run a Warm Start Hyperparameter Tuning Job
- Resource Limits for Automatic Model Tuning
- Best Practices for Hyperparameter Tuning