
Limitations and troubleshooting

The following sections provide troubleshooting help and describe limitations that apply when using Amazon SageMaker Canvas. You can use this topic to help troubleshoot any issues you encounter.

Troubleshooting issues with granting permissions through the SageMaker console

If you’re having trouble granting Canvas base permissions or Ready-to-use models permissions to your user, your user might have an Amazon IAM execution role with more than one trust relationship to other Amazon services. A trust relationship is a policy attached to your role that defines which principals (users, roles, accounts, or services) can assume the role. For example, you might encounter an issue granting additional Canvas permissions to your user if their execution role has a trust relationship to both Amazon SageMaker and Amazon Forecast.

You can fix this problem by choosing one of the following options.

1. Remove all but one trusted service from the role.

This solution requires you to edit the trust relationship for your user profile’s IAM role and remove all Amazon services except SageMaker.

To edit the trust relationship for your IAM execution role, do the following:

  1. Go to the IAM console at https://console.amazonaws.cn/iam/.

  2. In the navigation pane of the IAM console, choose Roles. The console displays the roles for your account.

  3. Choose the name of the role that you want to modify, and select the Trust relationships tab on the details page.

  4. Choose Edit trust policy.

  5. In the Edit trust policy editor, paste the following, and then choose Update Policy.

    { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": [ "sagemaker.amazonaws.com" ] }, "Action": "sts:AssumeRole" } ] }

You can also update this policy document using the IAM CLI. For more information, see update-assume-role-policy in the IAM Command Line Reference.
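
The following is a minimal sketch of making the same update with the Python SDK (boto3); the role name and the local policy structure are placeholders that you would replace with your own values.

    # Sketch: update a role's trust policy so that only SageMaker can assume it.
    # "MyCanvasExecutionRole" is a placeholder role name.
    import json
    import boto3

    iam = boto3.client("iam")

    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"Service": ["sagemaker.amazonaws.com"]},
                "Action": "sts:AssumeRole",
            }
        ],
    }

    # Overwrites the existing trust relationships on the role.
    iam.update_assume_role_policy(
        RoleName="MyCanvasExecutionRole",
        PolicyDocument=json.dumps(trust_policy),
    )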

You can now retry granting the Canvas base permissions or the Ready-to-use models permissions to your user.

2. Use a different role with only one trusted service.

This solution requires you to specify a different IAM role for your user profile. Use this option if you already have an IAM role that you can substitute.

To specify a different execution role for your user, do the following:

  1. Open the Amazon SageMaker console at https://console.amazonaws.cn/sagemaker/.

  2. On the left navigation pane, choose Admin configurations.

  3. Under Admin configurations, choose Domains.

  4. From the list of domains, select the domain that you want to view a list of user profiles for.

  5. On the domain details page, choose the User profiles tab.

  6. Choose the user whose permissions you want to edit. On the User details page, choose Edit.

  7. On the General settings page, choose the Execution role dropdown list and select the role that you want to use.

  8. Choose Submit to save your changes to the user profile.

Your user should now be using an execution role with only one trusted service (SageMaker).

You can retry granting the Canvas base permissions or the Ready-to-use models permissions to your user.
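
If you prefer to make this change programmatically, the following is a minimal sketch that uses the SageMaker UpdateUserProfile API through boto3; the domain ID, user profile name, and role ARN are placeholders.

    # Sketch: point a user profile at a different execution role.
    # The domain ID, user profile name, and role ARN below are placeholders.
    import boto3

    sm = boto3.client("sagemaker")

    sm.update_user_profile(
        DomainId="d-xxxxxxxxxxxx",
        UserProfileName="my-canvas-user",
        UserSettings={
            # A role whose trust policy names only sagemaker.amazonaws.com
            # (the partition is aws-cn in the China Regions, aws elsewhere).
            "ExecutionRole": "arn:aws-cn:iam::111122223333:role/MySageMakerOnlyRole"
        },
    )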

3. Manually attach the Amazon managed policy to the execution role instead of using the toggle in the SageMaker domain settings.

Instead of using the toggle in the domain or user profile settings, you can manually attach the Amazon managed policies that grant a user the correct permissions.

To grant a user Canvas base permissions, attach the AmazonSageMakerCanvasFullAccess policy. To grant a user Ready-to-use models permissions, attach the AmazonSageMakerCanvasAIServicesAccess policy.

Use the following procedure to attach an Amazon managed policy to your role:

  1. Go to the IAM console at https://console.amazonaws.cn/iam/.

  2. Choose Roles.

  3. In the search box, search for the user's IAM role by name and select it.

  4. On the page for the user's role, under Permissions, choose Add permissions.

  5. From the dropdown menu, choose Attach policies.

  6. Search for and select the policy or policies that you want to attach to the user’s execution role:

    1. To grant the Canvas base permissions, search for and select the AmazonSageMakerCanvasFullAccess policy.

    2. To grant the Ready-to-use models permissions, search for and select the AmazonSageMakerCanvasAIServicesAccess policy.

  7. Choose Add permissions to attach the policy to the role.

After attaching an Amazon managed policy to the user’s role through the IAM console, your user should now have the Canvas base permissions or Ready-to-use models permissions.
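
The following is a minimal sketch of making the same attachment with boto3; the role name is a placeholder, and the policy ARNs shown use the aws-cn partition (use arn:aws:iam::aws:policy/... outside of the China Regions).

    # Sketch: attach the Canvas managed policies to a user's execution role.
    # "MyCanvasExecutionRole" is a placeholder role name.
    import boto3

    iam = boto3.client("iam")

    for policy_arn in [
        # Canvas base permissions
        "arn:aws-cn:iam::aws:policy/AmazonSageMakerCanvasFullAccess",
        # Ready-to-use models permissions
        "arn:aws-cn:iam::aws:policy/AmazonSageMakerCanvasAIServicesAccess",
    ]:
        iam.attach_role_policy(
            RoleName="MyCanvasExecutionRole",
            PolicyArn=policy_arn,
        )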

Limitations for collaboration

The following general limitations apply when you are collaborating with data scientists in Amazon SageMaker Studio Classic.

  • You can only share successfully trained models from Canvas to Studio Classic. Similarly, you can only share models that have been successfully trained in Studio Classic back to Canvas.

  • You can’t share Quick build models from Canvas to Studio Classic. You can only share Standard build models.

  • You can only share one version of a Standard build model trained in Canvas. You can train additional versions of your model within Canvas, but you can't share them to Studio Classic.

  • From Studio Classic, you can only share feedback or share an updated model with Canvas. You can’t perform both actions at the same time.

  • The length limitation for comments shared from Studio Classic to Canvas and Canvas to Studio Classic is 1024 characters.

  • You can only share your Canvas or Studio Classic models with a different user profile. You can’t share models between Canvas and Studio Classic within your own user profile.

  • You can't share from a Canvas user to a Canvas user, or from a Studio Classic user to a Studio Classic user.

There are also limitations that apply depending on the type of model you want to share. See the following sections for limitations on time series forecasting models and numeric and categorical prediction models.

Limitations for collaborating on time series forecasting models

The following limitations apply when you are collaborating on time series forecasting models between Canvas and Studio Classic.

  • You can’t make predictions with time series forecasting models in Studio Classic through an automated workflow such as the Share button. However, you can create a Jupyter notebook and write your own prediction code (see the sketch following this list).

  • For time series forecasting models, you can’t change the model recipe or data transformations in Studio Classic. You can only make the following updates to time series forecasting models in Studio Classic:

    • You can update the length of the forecast horizon.

    • You can update the item's metadata field, which groups your data by a certain column.

    • You can update other dimension fields, such as specifying a holiday schedule.
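
The following is a minimal sketch of the kind of notebook code you might write, assuming the shared model has already been deployed to a SageMaker real-time endpoint; the endpoint name and CSV payload are hypothetical and depend on how the model was deployed.

    # Sketch: request a forecast from an already-deployed SageMaker endpoint
    # inside a Studio Classic notebook. The endpoint name and payload below
    # are hypothetical and depend on the deployed model's expected input.
    import boto3

    runtime = boto3.client("sagemaker-runtime")

    payload = "item_001,2023-01-01,150\nitem_001,2023-01-02,163\n"  # example CSV rows

    response = runtime.invoke_endpoint(
        EndpointName="my-canvas-forecast-endpoint",
        ContentType="text/csv",
        Body=payload,
    )

    print(response["Body"].read().decode("utf-8"))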

Limitations for collaborating on numeric and categorical prediction models

The following limitations apply when you are collaborating on numeric and categorical prediction model types between Canvas and Studio Classic.

  • When updating or training models in Studio Classic, if you close the tab with the collaboration banner at the top, it ends the share model workflow and you lose your progress. In that case, you must restart the share model workflow from the Shared With Me section on the Shared Models page. For more information, see Collaborate with data scientists.

  • When updating models in Studio Classic, you can’t change the target column if you want to share the model updates back to Canvas. If you want to change the target column and re-train the model, train the model and then use the Share button to share to Canvas. For more information about sharing a new model to Canvas, see Bring your own model to SageMaker Canvas.

  • When updating models in the Amazon SageMaker Data Wrangler Recipe interface in Studio Classic, there are limits on the changes that a Studio Classic user can apply and that Canvas supports:

    • You can only share a model to Canvas that has been trained from the last node in a Data Wrangler linear data flow.

    • Only transformation nodes are supported.

    • You can’t perform operations on the Target column.

    • You can’t update the data type of columns.

    • You can’t update the data source or add a new data source.

  • When sharing an alternative candidate to Canvas from the Studio Classic Autopilot page, you can’t select the model from the leaderboard. You must choose the shared model from the banner and then select an alternative from the list. For more information, see Share an alternate model with the Canvas user in the Canvas documentation.

  • Only models that are compatible with SageMaker Neo can be shared back to Canvas successfully. Compatible models are Autopilot models that use XGBoost or MLP algorithms. Incompatible models include Autopilot models that use the linear learner algorithm.

  • For custom formula transforms using Spark SQL, Canvas only supports Unary operations, Aggregate functions, the String concatenation operation and the Power operation. Other operations are not supported.

Limitations for bring your own model (BYOM)

The following general limitations apply when you want to bring your own model to SageMaker Canvas.

  • When a model is shared from Studio Classic to Canvas, the Canvas user cannot update or view details on the dataset that was used to build the model.

  • When a Canvas user wants to run a single prediction on an imported model, Canvas doesn’t enforce data type restrictions when updating column values. You must manually make sure that when you update values for single predictions, you match the data type of the existing values.

  • When a Canvas user wants to run batch predictions on an imported model, Canvas assumes that you (the Canvas user) know what the expected input dataset should look like. You should have a dataset with columns and data types that match the dataset that was used to train the model. If not, consult with the user who shared the model with you and import a dataset that you can use for running batch predictions.

  • The Canvas application internally uses a serverless endpoint to run predictions and generate model metrics. The model shared to Canvas must be compatible with serverless endpoints (an example endpoint configuration follows this list):

    • The maximum memory size is 6144 MB.

    • When configuring the inference input response keys in your container, use the following configuration:

      INFERENCE_INPUT_RESPONSE_KEYS = {
          "BINARY": ["predicted_label", "probability"],
          "MULTI_CLASS": ["predicted_label", "probability", "probabilities", "labels"],
      }
    • You can either choose a SageMaker-provided inference container or bring your own inference container image to use for the endpoint. SageMaker provides containers for its built-in algorithms and prebuilt Docker images for some of the most common machine learning frameworks. If you are bringing your own container, you must modify it to work with SageMaker. For more information about bringing your own container, see Adapting Your Own Inference Container.

    • The Feature exclusions for serverless endpoints also apply.

  • To share a model from Studio Classic to Canvas successfully, the model’s inference output must be in one of the following formats:

    TEXT/CSV

    • Regression: The model inference response should be a byte string where each output prediction is separated by \n:

      b'-0.0007884334772825241\n-0.015136942267417908\n0.050063662230968475\n0.02891816757619381\n'
    • Classification: The model inference response should be a byte string where each row contains the predicted_label, probability, probabilities, and labels values for one instance, and rows are separated by \n. The following example is for binary classification:

      b'no,0.9967488050460815,"[0.9967488050460815, 0.003251201706007123]","[\'no\', \'yes\']"\nno,0.9999420642852783,"[0.9999420642852783, 5.793538366560824e-05]","[\'no\', \'yes\']"\nno,0.9999846816062927,"[0.9999846816062927, 1.5326571883633733e-05]","[\'no\', \'yes\']"\nno,0.9999727606773376,"[0.9999727606773376, 2.7267418772680685e-05]","[\'no\', \'yes\']"\n'

      The following example is for multi-class classification:

      b'Iris-setosa,1.0,"[1.0, 0.0, 0.0]","[\'Iris-setosa\', \'Iris-versicolor\', \'Iris-virginica\']"\nIris-setosa,1.0,"[1.0, 0.0, 0.0]","[\'Iris-setosa\', \'Iris-versicolor\', \'Iris-virginica\']"\nIris-setosa,1.0,"[1.0, 0.0, 0.0]","[\'Iris-setosa\', \'Iris-versicolor\', \'Iris-virginica\']"\nIris-setosa,1.0,"[1.0, 0.0, 0.0]","[\'Iris-setosa\', \'Iris-versicolor\', \'Iris-virginica\']"\n'

    APPLICATION/JSON

    • Regression: The model inference response should be a JSON string that contains the predictions key, and its value should be the list of output predictions:

      let response = { "predictions": [ // First instance prediction. 1.75 // Second instance prediction. 3.25 ] }
    • Classification: The model inference response should be a JSON string which contains the probabilities key, and its value should be the list of probabilities.

      The following example is for binary classification:

      let response = { "probabilities": [ // First instance prediction. [0.9, 0.1] // Second instance prediction. [0.2, 0.8] ] }

      The following example is for multi-class classification:

      let response = { "probabilities": [ // First instance prediction. [0.7, 0.2, 0.1] // Second instance prediction. [0.2, 0.5, 0.3] ] }

There are also limitations that apply depending on the type of model you want to bring:

Bring your own model from SageMaker JumpStart

Review the following information and limits when sharing a SageMaker JumpStart model with Canvas.

  • The following algorithms are supported for importing models into Canvas. For more details, see the SageMaker JumpStart documentation.

    • Tabular classification: LightGBM, CatBoost, XGBoost, AutoGluon-Tabular, TabTransformer, Linear Learner

    • Tabular regression: LightGBM, CatBoost, XGBoost, AutoGluon-Tabular, TabTransformer, Linear Learner

  • In SageMaker JumpStart, the Share button is only turned on if the model is ready to share to Canvas. If your trained model does not have a Share to SageMaker Canvas button, your model is not supported for BYOM.

  • You must provide training and validation datasets when training the SageMaker JumpStart model. The datasets should be stored in Amazon S3, and your Studio Classic and Canvas users' execution role must have access to the Amazon S3 location. You can use the same Amazon S3 URIs to share the training and validation datasets with Canvas, or you can share different datasets with the same data schema.

    Your training or validation data file should look like the following example (in CSV format). The first column should be the target. (A short sketch for preparing these files follows this list.)

    3,1,22,1,1,0,4,4
    0,0,38,0,0,1,3,4
    1,0,67,0,1,0,1,6
    1,0,67,0,0,2,2,6
    0,0,40,0,0,2,6,6
    2,0,56,1,0,1,2,6
  • By default, SageMaker JumpStart uses the first column of the training and validation datasets as the target when training a model. The target column (or by default, the first column) of the datasets is shared to Canvas.

  • You must provide the column headers of the training and validation datasets when training the SageMaker JumpStart model. By default, SageMaker JumpStart only accepts datasets without column headers, so you must add the column headers as a file while training your model. The Amazon S3 URI for the column headers file is shared to Canvas as well. Your column headers file should look like the following example (in CSV format). The first column should be the target.

    Segmentation,EverMarried,Age,Graduated,WorkExperience,SpendingScore,FamilySize,Var1
  • The training job in SageMaker JumpStart must be Complete before you can share with Canvas.

  • For classification problems (or categorical prediction in Canvas), original class names need to be provided in the Configure model output section when sharing to Canvas. The order of the class names must match the indexing used in the model. Your mapping relation file should look like the following example in CSV format, where index 0 (the first index) is mapped to the class name A:

    A,B,C,D

    When the Canvas user views the model metrics in the Canvas application, they can only see the index of each class (0, 1, 2). However, the user can see the class names when viewing the results for a single prediction.
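
The following is a minimal sketch of preparing the training data and column headers files in the shape described above; the input file, column names, and output file names are placeholders based on the example dataset in this section.

    # Sketch: write a headerless training CSV (target in the first column) and
    # a separate single-row header file, as JumpStart expects. File names,
    # column names, and the source data are placeholders.
    import pandas as pd

    columns = [
        "Segmentation",  # target column must come first
        "EverMarried", "Age", "Graduated", "WorkExperience",
        "SpendingScore", "FamilySize", "Var1",
    ]

    df = pd.read_csv("customers.csv")  # raw data with headers
    df = df[columns]                   # reorder so the target is first

    df.to_csv("train.csv", header=False, index=False)                 # data, no headers
    pd.DataFrame(columns=columns).to_csv("headers.csv", index=False)  # header row only

    # Upload train.csv and headers.csv (and a validation file prepared the same
    # way) to an Amazon S3 location that the Studio Classic and Canvas users'
    # execution roles can access.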

Bring your own model from Autopilot

Review the following information and limits when sharing a model from Autopilot to Canvas.

  • You can only share models to Canvas that you’ve successfully trained from an AutoML job with Ensembling, HPO, or Auto mode (for Auto mode, Autopilot chooses Ensembling or HPO mode based on the training dataset size). The currently supported Autopilot problem types are Regression, Multi-class classification, and Binary classification.

  • For each Autopilot job, you can choose any model (the Best model or any other candidate) to share to Canvas, one at a time. Choose the Share model button, and then specify the Canvas users with whom you’d like to share the model and add a note.

  • AutoGluon-Tabular models that use Data Wrangler transformers for inference cannot be shared to Canvas. This is because Data Wrangler transformers cause the model to use more than one container.

  • HPO models that aren’t compatible with SageMaker Neo can’t be shared to Canvas successfully. Compatible models are Autopilot models that use XGBoost or MLP algorithms. Incompatible models include Autopilot models that use the linear learner algorithm.

Bring your own model from Model Registry

Review the following information and limits when sharing a model from Model Registry to Canvas.

  • Unlike the Share button provided by SageMaker JumpStart, Model Registry doesn’t provide model validation, so it’s possible that a registered model shared successfully from Studio Classic can fail while importing to Canvas due to model incompatibility. Review the following tips before sharing to Canvas from Model Registry:

    • Use a single inference container for your model. You can register models with multiple containers within the AdditionalInferenceSpecifications field, but Canvas is only optimized for one inference container per model. For example, when you use an inference pipeline and register multiple containers in the AdditionalInferenceSpecifications field (multiple data preprocessing containers and an inference container), the first container is selected for model inference in Canvas by default. Evaluate whether this works for your use case if you're using machine learning pipelines.

    • Use a SageMaker built-in tabular algorithm with compatible inference formats. Tested sample algorithms with compatible inference outputs are Autogluon-Tabular, CatBoost, LightGBM, TabTransformer and XGBoost. Algorithms like Factorization Machines don't accept CSV as file input, and the inference output formats for algorithms like Linear Learner and K-NN are not supported by Canvas.

    • You can also bring your own image container and share to Canvas, or modify pre-built SageMaker containers.

  • When registering your model in a model package group, remember to provide the following attributes with your inference container (a registration sketch appears at the end of this section):

    • Environment:

      "{\"SAGEMAKER_CONTAINER_LOG_LEVEL\": \"20\", \"SAGEMAKER_PROGRAM\": \"inference.py\", \"SAGEMAKER_REGION\": \"us-west-2\", \"SAGEMAKER_SUBMIT_DIRECTORY\": \"/opt/ml/model/code\"}"
    • Image:

      "<account-id>.dkr.ecr.us-west-2.amazonaws.com/sagemaker-xgboost:1.3-1"
    • ModelDataUrl:

      "s3://sagemaker-us-west-2-<account-id>/model-regression-abalone-2022-10-14-23-02-45/model.tar.gz"
  • You must provide training and validation datasets when sharing the model from Model Registry to Canvas. The datasets should be stored in Amazon S3, and the Studio Classic and Canvas users' execution role must have access to the Amazon S3 location. You can use the same Amazon S3 URIs to share the training and validation datasets with Canvas, or you can share different datasets with the same data schema. The datasets must have the exact input formatting that feeds your model’s inference container.

  • You must provide the target column to Canvas, or the first column of your training/validation dataset is used by default.

  • In the Add model details section when sharing to Canvas, you can provide the first row of your training and validation datasets as the headers, or you can specify the headers as a different file.

  • For classification problems (or categorical prediction in Canvas), original class names need to be provided when sharing to SageMaker Canvas through the Configure model outputs option. The order of the class names must match the indexing used with the shared model. The mapping can be either a CSV file in Amazon S3, or you can manually input the class names.
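
To tie the registration attributes above together, the following is a minimal sketch of registering a model version with a single inference container using the CreateModelPackage API; the model package group name, image URI, model artifact location, and environment values are placeholders patterned on the examples in this section.

    # Sketch: register a model version with a single inference container so it
    # can be shared to Canvas from Model Registry. All names, URIs, and
    # environment values below are placeholders.
    import boto3

    sm = boto3.client("sagemaker")

    sm.create_model_package(
        ModelPackageGroupName="my-canvas-model-group",
        ModelApprovalStatus="Approved",
        InferenceSpecification={
            "Containers": [
                {
                    "Image": "<account-id>.dkr.ecr.us-west-2.amazonaws.com/sagemaker-xgboost:1.3-1",
                    "ModelDataUrl": "s3://sagemaker-us-west-2-<account-id>/model-regression-abalone-2022-10-14-23-02-45/model.tar.gz",
                    "Environment": {
                        "SAGEMAKER_CONTAINER_LOG_LEVEL": "20",
                        "SAGEMAKER_PROGRAM": "inference.py",
                        "SAGEMAKER_REGION": "us-west-2",
                        "SAGEMAKER_SUBMIT_DIRECTORY": "/opt/ml/model/code",
                    },
                }
            ],
            "SupportedContentTypes": ["text/csv"],
            "SupportedResponseMIMETypes": ["text/csv"],
        },
    )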