@Generated(value="com.amazonaws:aws-java-sdk-code-generator") public class TextGenerationJobConfig extends Object implements Serializable, Cloneable, StructuredPojo
The collection of settings used by an AutoML job V2 for the text generation problem type.
The text generation models that support fine-tuning in Autopilot are currently accessible only in Regions supported by Canvas. Refer to the Canvas documentation for the full list of its supported Regions.
| Constructor and Description |
| --- |
| TextGenerationJobConfig() |
| Modifier and Type | Method and Description |
| --- | --- |
| TextGenerationJobConfig | addTextGenerationHyperParametersEntry(String key, String value) Add a single TextGenerationHyperParameters entry. |
| TextGenerationJobConfig | clearTextGenerationHyperParametersEntries() Removes all the entries added into TextGenerationHyperParameters. |
| TextGenerationJobConfig | clone() |
| boolean | equals(Object obj) |
| String | getBaseModelName() The name of the base model to fine-tune. |
| AutoMLJobCompletionCriteria | getCompletionCriteria() How long a fine-tuning job is allowed to run. |
| ModelAccessConfig | getModelAccessConfig() |
| Map<String,String> | getTextGenerationHyperParameters() The hyperparameters used to configure and optimize the learning process of the base model. |
| int | hashCode() |
| void | marshall(ProtocolMarshaller protocolMarshaller) Marshalls this structured data using the given ProtocolMarshaller. |
| void | setBaseModelName(String baseModelName) The name of the base model to fine-tune. |
| void | setCompletionCriteria(AutoMLJobCompletionCriteria completionCriteria) How long a fine-tuning job is allowed to run. |
| void | setModelAccessConfig(ModelAccessConfig modelAccessConfig) |
| void | setTextGenerationHyperParameters(Map<String,String> textGenerationHyperParameters) The hyperparameters used to configure and optimize the learning process of the base model. |
| String | toString() Returns a string representation of this object. |
| TextGenerationJobConfig | withBaseModelName(String baseModelName) The name of the base model to fine-tune. |
| TextGenerationJobConfig | withCompletionCriteria(AutoMLJobCompletionCriteria completionCriteria) How long a fine-tuning job is allowed to run. |
| TextGenerationJobConfig | withModelAccessConfig(ModelAccessConfig modelAccessConfig) |
| TextGenerationJobConfig | withTextGenerationHyperParameters(Map<String,String> textGenerationHyperParameters) The hyperparameters used to configure and optimize the learning process of the base model. |
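As a sketch of how these pieces fit together, the following configures a job config with a base model, a completion criterion, and hyperparameters via the fluent with* methods. The import paths are assumed from the SDK 1.x SageMaker model package, and withMaxRuntimePerTrainingJobInSeconds is an assumed setter name for the MaxRuntimePerTrainingJobInSeconds attribute mentioned below; this is a configuration sketch, not a verified end-to-end program.

```java
import com.amazonaws.services.sagemaker.model.AutoMLJobCompletionCriteria;
import com.amazonaws.services.sagemaker.model.TextGenerationJobConfig;

import java.util.HashMap;
import java.util.Map;

public class TextGenerationJobConfigSketch {
    public static void main(String[] args) {
        // Hyperparameter values are passed as strings (see getTextGenerationHyperParameters).
        Map<String, String> hyperParameters = new HashMap<>();
        hyperParameters.put("epochCount", "5");
        hyperParameters.put("learningRate", "0.5");

        TextGenerationJobConfig config = new TextGenerationJobConfig()
                .withBaseModelName("Falcon7BInstruct") // also the default when omitted
                .withCompletionCriteria(new AutoMLJobCompletionCriteria()
                        // assumed setter for the MaxRuntimePerTrainingJobInSeconds attribute
                        .withMaxRuntimePerTrainingJobInSeconds(259200)) // the documented 72h default
                .withTextGenerationHyperParameters(hyperParameters)
                // entries can also be added one at a time
                .addTextGenerationHyperParametersEntry("batchSize", "32");
    }
}
```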
public void setCompletionCriteria(AutoMLJobCompletionCriteria completionCriteria)

How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).

Parameters:
completionCriteria - How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).

public AutoMLJobCompletionCriteria getCompletionCriteria()
How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).

Returns:
How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).

public TextGenerationJobConfig withCompletionCriteria(AutoMLJobCompletionCriteria completionCriteria)
How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).

Parameters:
completionCriteria - How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).

public void setBaseModelName(String baseModelName)
The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.

Parameters:
baseModelName - The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.

public String getBaseModelName()
The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.

Returns:
The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.

public TextGenerationJobConfig withBaseModelName(String baseModelName)
The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.

Parameters:
baseModelName - The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.

public Map<String,String> getTextGenerationHyperParameters()
The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.

- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured.

{ "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }

Returns:
The hyperparameters used to configure and optimize the learning process of the base model.

- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured.

{ "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }
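The documented value ranges are string-encoded numbers, so out-of-range or non-numeric values are easy to produce by accident. As a plain-Java illustration independent of the SDK, the helper below checks a hyperparameter map against the ranges listed above; the class and method names are illustrative and not part of this API.

```java
import java.util.HashMap;
import java.util.Map;

public class HyperParameterCheck {
    // Returns true if every supplied value parses and falls in its documented range.
    static boolean isValid(Map<String, String> hp) {
        try {
            if (hp.containsKey("epochCount")) {
                int v = Integer.parseInt(hp.get("epochCount"));
                if (v < 1 || v > 10) return false;       // "1" to "10"
            }
            if (hp.containsKey("batchSize")) {
                int v = Integer.parseInt(hp.get("batchSize"));
                if (v < 1 || v > 64) return false;       // "1" to "64"
            }
            if (hp.containsKey("learningRate")) {
                double v = Double.parseDouble(hp.get("learningRate"));
                if (v < 0.0 || v > 1.0) return false;    // "0" to "1"
            }
            if (hp.containsKey("learningRateWarmupSteps")) {
                int v = Integer.parseInt(hp.get("learningRateWarmupSteps"));
                if (v < 0 || v > 250) return false;      // "0" to "250"
            }
            return true;
        } catch (NumberFormatException e) {
            return false; // values must be numeric strings
        }
    }

    public static void main(String[] args) {
        // The example map from the documentation above, checked locally.
        Map<String, String> hp = new HashMap<>();
        hp.put("epochCount", "5");
        hp.put("learningRate", "0.5");
        hp.put("batchSize", "32");
        hp.put("learningRateWarmupSteps", "10");
        System.out.println(isValid(hp));
    }
}
```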
public void setTextGenerationHyperParameters(Map<String,String> textGenerationHyperParameters)

The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.

- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured.

{ "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }

Parameters:
textGenerationHyperParameters - The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.

- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured.

{ "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }
public TextGenerationJobConfig withTextGenerationHyperParameters(Map<String,String> textGenerationHyperParameters)

The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.

- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured.

{ "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }

Parameters:
textGenerationHyperParameters - The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.

- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured.

{ "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }
public TextGenerationJobConfig addTextGenerationHyperParametersEntry(String key, String value)
public TextGenerationJobConfig clearTextGenerationHyperParametersEntries()
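The two methods above follow the usual map-builder pattern in this SDK: each addTextGenerationHyperParametersEntry call adds one key-value pair to the hyperparameter map, and clearTextGenerationHyperParametersEntries removes all entries. A minimal sketch, assuming only the methods documented on this page and the AWS SDK on the classpath:

```java
// Build the hyperparameter map entry by entry; each call returns the same
// config, so calls can be chained.
TextGenerationJobConfig config = new TextGenerationJobConfig()
        .addTextGenerationHyperParametersEntry("epochCount", "5")
        .addTextGenerationHyperParametersEntry("batchSize", "32");

// Later, discard all previously added entries and start over.
config.clearTextGenerationHyperParametersEntries();
```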
public void setModelAccessConfig(ModelAccessConfig modelAccessConfig)

Parameters:
modelAccessConfig -

public ModelAccessConfig getModelAccessConfig()

public TextGenerationJobConfig withModelAccessConfig(ModelAccessConfig modelAccessConfig)

Parameters:
modelAccessConfig -

public String toString()
Returns a string representation of this object.

Overrides:
toString in class Object
See Also:
Object.toString()
public TextGenerationJobConfig clone()
public void marshall(ProtocolMarshaller protocolMarshaller)
Marshalls this structured data using the given ProtocolMarshaller.

Specified by:
marshall in interface StructuredPojo

Parameters:
protocolMarshaller - Implementation of ProtocolMarshaller used to marshall this object's data.