Inference queries in Neptune ML

You can query a Neptune ML inference endpoint using either Gremlin or SPARQL. Real-time inductive inference, however, is currently supported only in Gremlin queries.
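
As an illustration, here is a minimal sketch of a Gremlin node-classification inference query. The endpoint name (node-classification-endpoint), vertex ID (movie_1), and property key (genre) are placeholders for your own model and graph, and the trailing Neptune#ml.inductiveInference step is the part that requests real-time inductive inference:

  // Sketch of a node-classification inference query; names are placeholders.
  g.with("Neptune#ml.endpoint", "node-classification-endpoint").
    V("movie_1").
    properties("genre").
      with("Neptune#ml.classification").
      with("Neptune#ml.inductiveInference").   // request real-time inductive inference (Gremlin only)
    value()

Without the Neptune#ml.inductiveInference step, the query returns predictions that were precomputed when the model was trained; with it, the prediction is computed at query time from the vertex's current neighborhood, which is what lets it cover data added to the graph after training.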