This document is relevant for: Inf1, Inf2, Trn1, Trn2
On December 19, 2025, AWS Neuron released the 2.27.0 version of the Neuron SDK.
This page provides detailed component release notes for the Neuron SDK 2.27.0. For an overview of the release content, see What’s New in AWS Neuron.
Neuron 2.27.0 Component Release Notes#
Select a card below to review detailed release notes for each component of the Neuron SDK version 2.27.0. These component release notes contain details on specific new and improved features, as well as breaking changes, bug fixes, and known issues for that component area of the Neuron SDK.
NxD Core and NxD Training Updates for 2.27#
Neuron support for PyTorch 2.9 will be the last to include NeuronX Distributed Training (NxDT), NxD Core training APIs, and PyTorch/XLA for training. Starting with Neuron support for PyTorch 2.10, these components will no longer be supported.
Existing NxDT/NxD Core users should stay on PyTorch 2.9 until they are ready to migrate to native PyTorch on Neuron (available starting with PyTorch 2.10). We recommend that customers use native PyTorch with standard distributed primitives (DTensor, FSDP, DDP) and TorchTitan starting with Neuron 2.28 and PyTorch 2.10, as sketched below. A migration guide will be published in Neuron 2.28.
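To make that recommendation concrete, here is a minimal sketch of native PyTorch training with one of those standard primitives, DDP. This is generic PyTorch rather than a Neuron-specific recipe: the model, data, and gloo backend are placeholder assumptions, and the forthcoming migration guide remains the authoritative path.

```python
# Minimal sketch of training with a standard PyTorch distributed primitive
# (DDP). Launch with, for example: torchrun --nproc_per_node=2 train.py
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main() -> None:
    dist.init_process_group(backend="gloo")  # assumption: CPU-friendly backend

    model = torch.nn.Sequential(             # placeholder model
        torch.nn.Linear(128, 256),
        torch.nn.ReLU(),
        torch.nn.Linear(256, 10),
    )
    model = DDP(model)                       # replicate and all-reduce gradients
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    loss_fn = torch.nn.CrossEntropyLoss()

    for _ in range(10):                      # placeholder training loop
        inputs = torch.randn(32, 128)        # synthetic batch
        labels = torch.randint(0, 10, (32,))
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), labels)
        loss.backward()                      # DDP synchronizes gradients here
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```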
Software maintenance announcements#
This section signals the official end of support for specific features, tools, and APIs. For the full set of Neuron release announcements, see Announcements.
Known issues: Samples#
When running the UNet training sample with the Neuron compiler, you may encounter this error: Estimated peak HBM usage exceeds 16GB.
To work around this error, include the function conv_wrap in your model. (You can find a usable example of this function in the UNet sample model code.) Then, define a custom backward pass for your model following the instructions and example in the PyTorch documentation. The UNet sample also illustrates how this is done for the convolution layers in UNet.
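As a rough illustration of the pattern the PyTorch documentation describes, the sketch below defines a custom backward pass by subclassing torch.autograd.Function around a convolution. It is hypothetical and is not the conv_wrap function from the UNet sample; the class name and its stride-1, no-padding defaults are assumptions made here for brevity.

```python
# Hypothetical illustration of a custom backward pass for a convolution.
# This is NOT conv_wrap from the UNet sample (see the sample code for the
# actual workaround); it only shows the torch.autograd.Function pattern.
import torch
from torch.nn.grad import conv2d_input, conv2d_weight


class ConvWithCustomBackward(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, weight):
        # Save the tensors needed to compute gradients later.
        ctx.save_for_backward(x, weight)
        return torch.nn.functional.conv2d(x, weight)  # stride 1, no padding

    @staticmethod
    def backward(ctx, grad_output):
        x, weight = ctx.saved_tensors
        # Compute gradients explicitly instead of using autograd's default.
        grad_x = conv2d_input(x.shape, weight, grad_output)
        grad_w = conv2d_weight(x, weight.shape, grad_output)
        return grad_x, grad_w


# Usage: .apply() routes the backward pass through the code above.
x = torch.randn(1, 3, 8, 8, requires_grad=True)
w = torch.randn(4, 3, 3, 3, requires_grad=True)
ConvWithCustomBackward.apply(x, w).sum().backward()
```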
Previous releases#
This document is relevant for: Inf1, Inf2, Trn1, Trn2