Resources for using SparkML Serving with Amazon SageMaker AI
The Amazon SageMaker Python SDK SparkML Serving model and predictor and the Amazon SageMaker AI open-source
SparkML Serving container support deploying Apache Spark ML pipelines serialized
with MLeap to SageMaker AI for inference. Use the following resources to learn how to use SparkML Serving
with SageMaker AI.
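
The following is a minimal sketch of that workflow using the SageMaker Python SDK SparkMLModel class. The S3 model location, IAM role ARN, instance type, and input record are placeholder assumptions; the columns in the input record must match the schema that your own pipeline was trained with.

# Minimal sketch: deploy an MLeap-serialized Spark ML pipeline with the
# SageMaker Python SDK SparkMLModel, then request an inference.
# The S3 path, role ARN, and CSV record below are placeholders.
import sagemaker
from sagemaker.sparkml.model import SparkMLModel

session = sagemaker.Session()
role = "arn:aws:iam::111122223333:role/SageMakerExecutionRole"  # placeholder role ARN

# model.tar.gz must contain the Spark ML pipeline serialized with MLeap.
sparkml_model = SparkMLModel(
    model_data="s3://amzn-s3-demo-bucket/sparkml/model.tar.gz",  # placeholder path
    role=role,
    sagemaker_session=session,
)

# Deploy the model to a real-time endpoint; deploy() returns a SparkMLPredictor.
predictor = sparkml_model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",  # placeholder instance type
)

# The SparkML Serving container accepts CSV input by default; the column
# order must match the pipeline's input schema.
print(predictor.predict("1.0,28.0,C,38.0,71.5,1.0"))  # placeholder record

# Clean up the endpoint when finished.
predictor.delete_endpoint()
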
For information about using the SparkML Serving container to deploy models to SageMaker AI, see
the SageMaker Spark
ML Container GitHub repository. For information about the Amazon SageMaker Python SDK
SparkML Serving model and predictor, see the SparkML Serving
Model and Predictor API documentation.
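
If an endpoint already exists, the predictor class can be used on its own. The following is a minimal sketch; the endpoint name and CSV record are placeholders.

# Minimal sketch: send a request to an existing SparkML Serving endpoint
# with SparkMLPredictor. The endpoint name and input record are placeholders.
from sagemaker.sparkml.model import SparkMLPredictor

predictor = SparkMLPredictor(endpoint_name="sparkml-serving-endpoint")  # placeholder name
print(predictor.predict("1.0,28.0,C,38.0,71.5,1.0"))  # placeholder record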