Related Information - Deep Learning AMI
Services or capabilities described in Amazon Web Services documentation might vary by Region. To see the differences applicable to the China Regions, see Getting Started with Amazon Web Services in China (PDF).

Related Information

Forums

Related Blog Posts

FAQ

  • Q. How do I keep track of product announcements related to DLAMI?

    Two suggestions: watch the DLAMI forum for announcements, and follow the related blog posts. Both are linked under Related Information above.

  • Q. Are the NVIDIA drivers and CUDA installed?

    Yes, although the versions differ from one DLAMI to another. The Deep Learning AMI with Conda has the most recent versions of any DLAMI. This is covered in more detail in CUDA Installations and Framework Bindings. You can also refer to the specific AMI's release notes to confirm what is installed.
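
    To confirm which CUDA toolkits are present and which compiler is on the PATH, you can run the checks below. The /usr/local layout shown is the typical DLAMI arrangement; the exact versions vary by AMI release.

      # List the CUDA toolkit versions installed under /usr/local
      ls /usr/local | grep cuda

      # Show the version of the CUDA compiler currently on the PATH
      nvcc --version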

  • Q. Is cuDNN installed?

    Yes.
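
    If you want to verify the cuDNN version yourself, one common check is to read the version macros from the installed header. This assumes the headers live under the default /usr/local/cuda include path, which can vary by AMI release (older releases define the macros in cudnn.h, newer ones in cudnn_version.h).

      # Print the cuDNN version macros from the installed header
      grep -E "CUDNN_(MAJOR|MINOR|PATCHLEVEL)" /usr/local/cuda/include/cudnn*.h | head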

  • Q. How do I see that the GPUs are detected and their current status?

    Run nvidia-smi. This will show one or more GPUs, depending on the instance type, along with their current memory consumption.
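
    For example, on a GPU instance:

      # One-time snapshot of the GPUs, their utilization, and memory use
      nvidia-smi

      # Refresh the same report every two seconds while a job runs
      watch -n 2 nvidia-smi

    On a CPU-only instance, nvidia-smi reports that no devices were found.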

  • Q. Are virtual environments set up for me?

    Yes, but only on the Deep Learning AMI with Conda.
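
    A minimal sketch of the workflow (the exact environment names depend on the DLAMI release, so list them first):

      # Show the preconfigured Conda environments on the AMI
      conda env list

      # Activate one of them; the name below is an example and may differ on your AMI
      source activate tensorflow_p36

      # ...run your training script...

      # Leave the environment when you are done
      source deactivate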

  • Q. What version of Python is installed?

    Each DLAMI has both Python 2 and 3. The Deep Learning AMI with Conda has environments for both versions of each framework.
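
    To see which interpreter a given environment uses, activate it and ask Python directly (the environment name below is only illustrative):

      source activate pytorch_p36   # example environment name
      python --version              # prints the interpreter version for this environment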

  • Q. Is Keras installed?

    This depends on the AMI. The Deep Learning AMI with Conda has Keras available as a front end for each framework. The version of Keras depends on the framework's support for it.
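
    To check the Keras version shipped in a particular environment, a quick probe (environment name is again an example) is:

      source activate tensorflow_p36
      python -c "import keras; print(keras.__version__)"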

  • Q. Is it free?

    All of the DLAMIs are free. However, depending on the instance type you choose, the instance may not be free. See Pricing for the DLAMI for more info.

  • Q. I'm getting CUDA errors or GPU-related messages from my framework. What's wrong?

    Check what instance type you used. It needs to have a GPU for many examples and tutorials to work. If running nvidia-smi shows no GPU, then you need to spin up another DLAMI using an instance with one or more GPUs. See Selecting the Instance Type for DLAMI for more info.
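
    Besides nvidia-smi, you can ask the framework itself whether it can see a GPU. Two quick probes from an activated environment are shown below; the exact API available depends on the framework version installed on your AMI.

      # PyTorch: prints True only when a CUDA-capable GPU is visible to the framework
      python -c "import torch; print(torch.cuda.is_available())"

      # TensorFlow 2.x: lists the GPU devices the framework can use
      python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"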

  • Q. Can I use Docker?

    Docker has been pre-installed since version 14 of the Deep Learning AMI with Conda. Note that you will want to use nvidia-docker on GPU instances to make use of the GPU.
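
    A common smoke test is to run nvidia-smi inside a container; if the GPU shows up there, the container runtime is wired up correctly. The CUDA image tag below is only an example and may need updating.

      # Older DLAMI releases ship the nvidia-docker wrapper
      nvidia-docker run --rm nvidia/cuda:11.0-base nvidia-smi

      # Docker 19.03 and later can use the --gpus flag with the NVIDIA container runtime
      docker run --rm --gpus all nvidia/cuda:11.0-base nvidia-smi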

  • Q. What regions are Linux DLAMIs available in?

    Region                        Code
    US East (Ohio)                us-east-2
    US East (N. Virginia)         us-east-1
    US West (N. California)       us-west-1
    US West (Oregon)              us-west-2
    Beijing (China)               cn-north-1
    Ningxia (China)               cn-northwest-1
    Asia Pacific (Mumbai)         ap-south-1
    Asia Pacific (Seoul)          ap-northeast-2
    Asia Pacific (Singapore)      ap-southeast-1
    Asia Pacific (Sydney)         ap-southeast-2
    Asia Pacific (Tokyo)          ap-northeast-1
    Canada (Central)              ca-central-1
    EU (Frankfurt)                eu-central-1
    EU (Ireland)                  eu-west-1
    EU (London)                   eu-west-2
    EU (Paris)                    eu-west-3
    SA (Sao Paulo)                sa-east-1
  • Q. What regions are Windows DLAMIs available in?

    Region                        Code
    US East (Ohio)                us-east-2
    US East (N. Virginia)         us-east-1
    US West (N. California)       us-west-1
    US West (Oregon)              us-west-2
    Beijing (China)               cn-north-1
    Asia Pacific (Mumbai)         ap-south-1
    Asia Pacific (Seoul)          ap-northeast-2
    Asia Pacific (Singapore)      ap-southeast-1
    Asia Pacific (Sydney)         ap-southeast-2
    Asia Pacific (Tokyo)          ap-northeast-1
    Canada (Central)              ca-central-1
    EU (Frankfurt)                eu-central-1
    EU (Ireland)                  eu-west-1
    EU (London)                   eu-west-2
    EU (Paris)                    eu-west-3
    SA (Sao Paulo)                sa-east-1