
Error messages

The following sections list and explain some of the messages that you might encounter when using Amazon Personalize.

Data import and management

Error message: Invalid Data location.

Make sure you used the correct syntax for your Amazon S3 bucket location. For dataset import jobs, use the following syntax for the location of your data in Amazon S3:

s3://<name of your S3 bucket>/<folder path>/<CSVfilename>

If your CSV files are in a folder and you want to upload multiple files with one dataset import job, use this syntax without the CSV file name.
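
For example, the following sketch creates a dataset import job with the SDK for Python (Boto3) using this syntax. The ARNs, bucket name, and file name are placeholders for illustration.

import boto3

personalize = boto3.client("personalize")

# Placeholder ARNs, bucket, and file name -- replace with your own values.
response = personalize.create_dataset_import_job(
    jobName="my-import-job",
    datasetArn="arn:aws:personalize:us-west-2:123456789012:dataset/my-dataset-group/INTERACTIONS",
    dataSource={
        # Point at a single CSV file, or omit the file name to import
        # every CSV file in the folder.
        "dataLocation": "s3://my-personalize-bucket/import-data/interactions.csv"
    },
    roleArn="arn:aws:iam::123456789012:role/PersonalizeS3Role",
)
print(response["datasetImportJobArn"])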

Error message: An error occurred (LimitExceededException) when calling the CreateDatasetImportJob operation: More than 5 resources with PENDING or IN_PROGRESS status.

You can have a total of 5 pending or in-progress dataset import jobs per Region. This quota is not adjustable. For a complete list of quotas for Amazon Personalize, see Amazon Personalize endpoints and quotas.
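
If you're not sure which jobs count toward the quota, you can list your dataset import jobs and check their status. The following Boto3 sketch assumes that an unfinished job's status string contains PENDING or IN_PROGRESS.

import boto3

personalize = boto3.client("personalize")

# List dataset import jobs in the current Region and count the ones
# that are still pending or in progress.
jobs = personalize.list_dataset_import_jobs(maxResults=100)["datasetImportJobs"]
unfinished = [job["jobName"] for job in jobs
              if "PENDING" in job["status"] or "IN_PROGRESS" in job["status"]]
print(f"{len(unfinished)} import job(s) still pending or in progress: {unfinished}")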

Error message: Failed to create a data import job for <dataset type> dataset....Insufficient privileges for accessing data in Amazon S3.

Give Amazon Personalize access to your Amazon S3 resources by attaching access policies to your Amazon S3 bucket and your Amazon Personalize service role. See Giving Amazon Personalize access to Amazon S3 resources.

If you use Amazon Key Management Service (Amazon KMS) for encryption, you must grant Amazon Personalize and your Amazon Personalize IAM service role permission to use your key. For more information, see Giving Amazon Personalize permission to use your Amazon KMS key.
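
As a minimal sketch, the following Boto3 code attaches a bucket policy that lets the Amazon Personalize service principal read from your bucket. The bucket name is a placeholder, and the statement assumes you only need read access; add s3:PutObject if Amazon Personalize also writes output to the bucket.

import json
import boto3

s3 = boto3.client("s3")
bucket = "my-personalize-bucket"  # placeholder bucket name

# Bucket policy that lets Amazon Personalize read objects and list the bucket.
# In the China Regions, S3 ARNs use the arn:aws-cn partition.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PersonalizeS3BucketAccessPolicy",
        "Effect": "Allow",
        "Principal": {"Service": "personalize.amazonaws.com"},
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
    }],
}
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))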

Error message: Failed to create a data import job <dataset type> dataset...Input CSV is missing the following columns:[COLUMN_NAME, COLUMN_NAME].

The data that you import into Amazon Personalize, including attribute names and data types, must match the destination dataset's schema. For more information, see Schemas.
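
For example, a minimal Interactions schema created with Boto3 looks like the following. The CSV that you import must have a header row with exactly these column names, plus any additional fields you add to the schema. The schema name is a placeholder.

import json
import boto3

personalize = boto3.client("personalize")

# Minimal Interactions schema. The CSV header must match these fields,
# for example: USER_ID,ITEM_ID,TIMESTAMP
schema = {
    "type": "record",
    "name": "Interactions",
    "namespace": "com.amazonaws.personalize.schema",
    "fields": [
        {"name": "USER_ID", "type": "string"},
        {"name": "ITEM_ID", "type": "string"},
        {"name": "TIMESTAMP", "type": "long"},
    ],
    "version": "1.0",
}
personalize.create_schema(name="my-interactions-schema", schema=json.dumps(schema))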

Error message: Length cannot be more than <character limit> characters for <COLUMN_NAME>. If no values exceed the character limit, make sure your data follows the formatting guidelines listed in https://docs.aws.amazon.com/personalize/latest/dg/data-prep-formatting.html.

Make sure that no values in this column exceed the character limit. If none do, check any preceding textual fields for the following:

  • Make sure any textual data is wrapped in double quotes. Use the \ character to escape any double quotes or \ characters in your data (see the sketch after this list).

  • Make sure each record in your CSV file is on a single line.
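
The following sketch shows one way to produce this formatting with Python's csv module. The column names and values are made up for illustration.

import csv

# Example rows with textual data that contains double quotes and a line break.
rows = [
    {"ITEM_ID": "item-1", "DESCRIPTION": 'A "deluxe" stainless steel bottle'},
    {"ITEM_ID": "item-2", "DESCRIPTION": "Text that\nspans lines"},
]

with open("items.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f,
        fieldnames=["ITEM_ID", "DESCRIPTION"],
        quoting=csv.QUOTE_ALL,  # wrap every field in double quotes
        doublequote=False,      # don't double embedded quotes...
        escapechar="\\",        # ...escape them with a backslash instead
    )
    writer.writeheader()
    for row in rows:
        # Keep each record on a single line by removing line breaks.
        row["DESCRIPTION"] = row["DESCRIPTION"].replace("\n", " ")
        writer.writerow(row)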

Creating a solution and solution version (custom resources)

Error message: Create failed. Dataset has fewer than 25 users with at least 2 interactions each.

You must import more data before you can train the model. The minimum data requirements to train a model are:

  • At least 1000 item interaction records from users interacting with items in your catalog. These interactions can come from bulk imports, streamed events, or both.

  • At least 25 unique user IDs with at least 2 item interactions each.

For real-time recommendations, import more data with a dataset import job or record more interaction events for your users with an event tracker and the PutEvents operation. For more information on recording real-time events, see Recording events.
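
For example, a minimal PutEvents call with Boto3 might look like the following. The tracking ID, user, session, and item IDs, and the event type are placeholders.

import time
import boto3

personalize_events = boto3.client("personalize-events")

personalize_events.put_events(
    trackingId="11111111-2222-3333-4444-555555555555",  # from your event tracker
    userId="user-123",
    sessionId="session-456",
    eventList=[
        {
            "eventType": "Watch",        # must match an event type in your data
            "itemId": "item-789",
            "sentAt": int(time.time()),  # timestamp of when the event occurred
        }
    ],
)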

For batch recommendations, import your data with a dataset import job when you have more data. For more information about importing bulk data, see Step 2: Preparing and importing data.

Model deployment (custom campaigns)

Error: Cannot create a campaign. More than 5 resources in ACTIVE state. Please delete some and try again.

You can have a total of 5 active Amazon Personalize campaigns per dataset group. This quota is adjustable and you can request a quota increase using the Service Quotas console. For a complete list of limits and quotas for Amazon Personalize, see Amazon Personalize endpoints and quotas.
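
To see which campaigns count toward the quota, you can list the campaigns in your account and delete any that you no longer need. The following Boto3 sketch uses a placeholder campaign ARN.

import boto3

personalize = boto3.client("personalize")

# List campaigns in the current Region.
campaigns = personalize.list_campaigns(maxResults=100)["campaigns"]
for campaign in campaigns:
    print(campaign["name"], campaign["status"], campaign["campaignArn"])

# Delete a campaign you no longer need (placeholder ARN).
# personalize.delete_campaign(
#     campaignArn="arn:aws:personalize:us-west-2:123456789012:campaign/unused-campaign"
# )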

Recommenders (Domain dataset groups)

Error: Dataset has fewer than 1000 interactions after filtering by event type: <event type>

Different use cases require different event types. Your data must have at least 1000 events of the required type for your use case. For more information, see Choosing a use case.
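
To check whether your data meets this requirement before you import it, you can count interactions by event type in your CSV. The following sketch assumes a local file named interactions.csv with an EVENT_TYPE column.

import csv
from collections import Counter

# Count interactions by event type (placeholder file name).
counts = Counter()
with open("interactions.csv", newline="") as f:
    for row in csv.DictReader(f):
        counts[row.get("EVENT_TYPE", "")] += 1

# The event type required by your use case needs at least 1000 rows.
print(counts)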

Recommendations

Batch inference job error message: Invalid S3 input path or Invalid S3 output path

Make sure you use the correct syntax for your Amazon S3 input or output locations. Also make sure that your output location is different from your input data location. It should be a folder in the same Amazon S3 bucket or a different bucket.

Use the following syntax for the input file location in Amazon S3: s3://<name of your S3 bucket>/<folder name>/<input JSON file name>

Use the following syntax for the output folder in Amazon S3: s3://<name of your S3 bucket>/<output folder name>/
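
For example, a Boto3 call that sets both locations might look like the following. The ARNs, bucket, and folder names are placeholders.

import boto3

personalize = boto3.client("personalize")

response = personalize.create_batch_inference_job(
    jobName="my-batch-job",
    solutionVersionArn="arn:aws:personalize:us-west-2:123456789012:solution/my-solution/11111111",
    roleArn="arn:aws:iam::123456789012:role/PersonalizeS3Role",
    jobInput={
        # Input is a single JSON file.
        "s3DataSource": {"path": "s3://my-personalize-bucket/batch/input/users.json"}
    },
    jobOutput={
        # Output is a folder that is different from the input location.
        "s3DataDestination": {"path": "s3://my-personalize-bucket/batch/output/"}
    },
)
print(response["batchInferenceJobArn"])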

Filtering recommendations

Error message: Could not create filter. Invalid input symbol: $parameterName. Placeholders are not allowed with NOT_IN operator.

You can't use placeholder parameters in a filter expression that uses the NOT_IN operator. Instead, use the IN operator and switch the action: use Exclude instead of Include (or the reverse).

For example, if you want to use INCLUDE ItemID WHERE Items.GENRE NOT IN ($GENRE), you can use EXCLUDE ItemID WHERE Items.GENRE IN ($GENRE) and get the same results.

For more information about filters, see Filter expression elements.
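
For example, you could create the equivalent filter with Boto3 and supply the genre at request time. The ARNs, names, and values here are placeholders.

import boto3

personalize = boto3.client("personalize")
personalize_runtime = boto3.client("personalize-runtime")

# Create the filter with the IN operator and an Exclude action.
create_response = personalize.create_filter(
    name="exclude-genre-filter",
    datasetGroupArn="arn:aws:personalize:us-west-2:123456789012:dataset-group/my-dataset-group",
    filterExpression='EXCLUDE ItemID WHERE Items.GENRE IN ($GENRE)',
)

# After the filter is ACTIVE, supply a value for the $GENRE placeholder
# when you request recommendations.
recommendations = personalize_runtime.get_recommendations(
    campaignArn="arn:aws:personalize:us-west-2:123456789012:campaign/my-campaign",
    userId="user-123",
    filterArn=create_response["filterArn"],
    filterValues={"GENRE": '"Comedy"'},
)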

Error message: Could not create filter. Invalid Expression... when filtering on Boolean type fields

You can't create filter expressions that filter using values with a Boolean type in your schema. To filter based on Boolean values, use a schema with a field of type string and use the values True and False in your data. Or you can use a field of type int or long and the values 0 and 1.

For more information about filters, see Filter expression elements.
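
For example, instead of a boolean field you might define a string field in your Items schema and filter on the values True and False. The field name below is made up for illustration.

# Instead of: {"name": "IS_PROMOTED", "type": "boolean"}
# define the field as a string in your Items schema:
field = {"name": "IS_PROMOTED", "type": "string"}

# Store "True" or "False" in your CSV data, then filter on the string value:
filter_expression = 'INCLUDE ItemID WHERE Items.IS_PROMOTED IN ("True")'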