This document is relevant for: Inf2, Trn1, Trn2, Trn3
Get started with the Neuron multi-framework DLAMI#
The Neuron multi-framework Deep Learning AMI (DLAMI) provides a pre-configured environment with multiple frameworks and libraries ready to use. Each framework has its own virtual environment with all Neuron components pre-installed.
The multi-framework DLAMI supports Inf2, Trn1, Trn1n, Trn2, and Trn3 instances and is updated with each Neuron SDK release.
Step 1: Launch the instance#
Important
Currently, only Ubuntu 24.04 is supported for multi-framework DLAMIs.
Open the EC2 Console, select your desired AWS region, and choose “Launch Instance”. Under AMI selection, choose “Quick Start” then “Ubuntu”, and select Deep Learning AMI Neuron (Ubuntu 24.04).
Select your desired Neuron instance type (Inf2, Trn1, Trn1n, Trn2, or Trn3), configure disk size (minimum 512 GB for Trn instances), and launch the instance.
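The console steps above can also be sketched with the AWS CLI. The AMI ID, key-pair name, and region below are placeholders, not real values; substitute your own (the note that follows explains how to resolve the current AMI ID).

```shell
# Sketch of launching a Trn1 instance via the AWS CLI. AMI_ID, KEY_NAME,
# and the region are placeholders -- substitute your own values.
AMI_ID=ami-0123456789abcdef0
KEY_NAME=my-key-pair

if command -v aws >/dev/null 2>&1; then
    # 512 GB gp3 root volume, per the minimum for Trn instances above.
    aws ec2 run-instances \
        --region us-east-1 \
        --image-id "$AMI_ID" \
        --instance-type trn1.2xlarge \
        --key-name "$KEY_NAME" \
        --block-device-mappings '[{"DeviceName":"/dev/sda1","Ebs":{"VolumeSize":512,"VolumeType":"gp3"}}]' \
        || echo "launch failed -- check credentials, region, and service quotas"
else
    echo "AWS CLI not installed; use the EC2 console steps above."
fi
```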
Note
To retrieve the latest DLAMI ID programmatically for automation flows, use SSM parameters.
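For example, with the AWS CLI (the parameter path below follows the documented multi-framework pattern, but the `ubuntu-24.04` segment is an assumption; confirm the exact path in the Neuron DLAMI User Guide):

```shell
# Assumed SSM parameter path for the latest multi-framework DLAMI on
# Ubuntu 24.04 -- verify the exact path in the Neuron DLAMI User Guide.
PARAM=/aws/service/neuron/dlami/multi-framework/ubuntu-24.04/latest/image_id

if command -v aws >/dev/null 2>&1; then
    aws ssm get-parameter --region us-east-1 --name "$PARAM" \
        --query Parameter.Value --output text \
        || echo "lookup failed -- check the parameter path and your credentials"
else
    echo "AWS CLI not installed"
fi
```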
Step 2: Activate a virtual environment#
The multi-framework DLAMI includes pre-configured virtual environments for each supported framework and library. Activate the one that matches your use case:
Find the virtual environment name for your framework or library in the Neuron DLAMI overview.
Activate the virtual environment:
source /opt/<name_of_virtual_environment>/bin/activate
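If you are unsure which environments are present on your instance, you can enumerate them directly (assuming each one lives under /opt with a standard bin/activate script, as the command above implies):

```shell
# List every directory under /opt that contains a bin/activate script --
# these are the candidate virtual environments you can source.
ls -d /opt/*/bin/activate 2>/dev/null || echo "no virtual environments found under /opt"
```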
Common virtual environments include:
| Framework | Virtual environment | Use case |
|---|---|---|
| PyTorch 2.9 | | Training and inference |
| PyTorch vLLM | | LLM inference serving |
| JAX | | Training and inference |
Note
Virtual environment names and available frameworks may vary by DLAMI version. See Virtual Environments pre-installed for the complete list.
Step 3: Verify and start#
After activating a virtual environment, verify the installation. For a PyTorch environment:
python3 -c "import torch; import torch_neuronx; print(f'PyTorch {torch.__version__}, torch-neuronx {torch_neuronx.__version__}')"
neuron-ls
You should see output similar to the following (your framework versions, instance type, and instance ID will differ from this example):
PyTorch 2.9.1+cu128, torch-neuronx 2.9.0.2.13.23887+8e870898
$ neuron-ls
instance-type: trn1.2xlarge
instance-id: i-0bea223b1afb7e159
+--------+--------+----------+--------+--------------+----------+------+
| NEURON | NEURON | NEURON | NEURON | PCI | CPU | NUMA |
| DEVICE | CORES | CORE IDS | MEMORY | BDF | AFFINITY | NODE |
+--------+--------+----------+--------+--------------+----------+------+
| 0 | 2 | 0-1 | 32 GB | 0000:00:1e.0 | 0-7 | -1 |
+--------+--------+----------+--------+--------------+----------+------+
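The one-line check above assumes the Neuron packages import cleanly. A slightly more defensive sketch (not part of the DLAMI; it works in any Python environment) reports a missing package instead of raising:

```python
# Defensive version check: prints each package's version if installed,
# or a hint to activate a Neuron virtual environment if it is missing.
import importlib
import importlib.util

for mod in ("torch", "torch_neuronx"):
    if importlib.util.find_spec(mod) is None:
        print(f"{mod}: not installed -- activate a Neuron virtual environment first")
    else:
        print(f"{mod} {importlib.import_module(mod).__version__}")
```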
For a JAX environment:
python3 -c "import jax; print(f'JAX {jax.__version__}'); print(f'Devices: {jax.devices()}')"
neuron-ls
You should see output similar to the following (your framework versions, instance type, and instance ID will differ from this example):
JAX 0.6.2.1.0.1
$ neuron-ls
instance-type: trn1.2xlarge
instance-id: i-0bea223b1afb7e159
+--------+--------+----------+--------+--------------+----------+------+
| NEURON | NEURON | NEURON | NEURON | PCI | CPU | NUMA |
| DEVICE | CORES | CORE IDS | MEMORY | BDF | AFFINITY | NODE |
+--------+--------+----------+--------+--------------+----------+------+
| 0 | 2 | 0-1 | 32 GB | 0000:00:1e.0 | 0-7 | -1 |
+--------+--------+----------+--------+--------------+----------+------+
Next steps#
After setup, explore the framework documentation:
PyTorch Support on Neuron - PyTorch on Neuron
JAX Support on Neuron - JAX on Neuron
vLLM on Neuron - vLLM on Neuron
Neuron DLAMI User Guide - Full DLAMI documentation and SSM parameters