The Amazon Inferentia Chip With DLAMI - Deep Learning AMI
Services or capabilities described in Amazon Web Services documentation might vary by Region. To see the differences applicable to the China Regions, see Getting Started with Amazon Web Services in China (PDF).


Amazon Inferentia is a custom machine learning chip designed by Amazon for high-performance inference. To use the chip, launch an Amazon Elastic Compute Cloud (Amazon EC2) instance powered by Inferentia and use the Amazon Neuron software development kit (SDK) to compile and run your models on it. To provide customers with the best Inferentia experience, Neuron comes preinstalled in the Amazon Deep Learning AMI (DLAMI).
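As a rough sketch of that workflow, the commands below show how you might locate and activate one of the preinstalled Neuron conda environments after connecting to a DLAMI instance. The exact environment names vary by DLAMI release, and `aws_neuron_pytorch_p36` here is an assumption for illustration; check `conda env list` on your own instance for the names that actually ship with your AMI. These commands require an Inferentia-backed EC2 instance running the DLAMI.

```shell
# List the conda environments that ship with the DLAMI;
# Neuron-enabled environments typically have "neuron" in the name.
conda env list

# Activate a Neuron environment (example name only -- confirm on your instance).
source activate aws_neuron_pytorch_p36

# Verify that the Neuron tooling is available in the activated environment.
python -c "import torch_neuron" && echo "Neuron SDK is available"
```

From the activated environment you can compile a trained model with the Neuron SDK and then run inference against the Inferentia chip, as described in the topics that follow.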

The following topics show you how to get started using Inferentia with the DLAMI.