This document is relevant for: Inf1, Inf2, Trn1, Trn2

Neuron XLA pluggable device (libneuronxla) release notes#

libneuronxla is a software package that provides Neuron’s integration with the PJRT runtime, implemented through the PJRT C-API plugin mechanism.
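For orientation, the sketch below shows what that integration looks like from torch-xla’s side: the Neuron plugin is selected through the PJRT runtime, here via the PJRT_DEVICE=NEURON environment variable. This is a minimal sketch based on common torch-xla/torch-neuronx conventions, not a setup procedure taken from these notes; consult the Neuron setup guides for the supported configuration.

    # Hedged smoke test: confirm libneuronxla surfaces Neuron as a PJRT
    # device to torch-xla. PJRT_DEVICE=NEURON is an assumed selector;
    # the exact environment configuration may differ by release.
    import os
    os.environ.setdefault("PJRT_DEVICE", "NEURON")

    import torch
    import torch_xla.core.xla_model as xm

    device = xm.xla_device()            # a Neuron core exposed through PJRT
    x = torch.ones(2, 2).to(device)
    y = x + x
    print(xm.xla_device_hw(device), y.cpu())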

Release [2.0.5347.0]#

Date: 11/20/2024

Summary#

This release adds support for torch-xla 2.1.5, which fixes the “list index out of range” error when loading Zero Redundancy Optimizer (ZeRO-1) checkpoints.
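For context, the affected path is torch-xla’s ZeRO-1 optimizer wrapper. The sketch below illustrates the checkpoint round trip that previously failed on load; the ZeroRedundancyOptimizer constructor arguments shown are illustrative assumptions, not details taken from these notes.

    # Sketch of the ZeRO-1 checkpoint round trip that previously hit
    # "list index out of range" on load; argument names are illustrative.
    import torch
    import torch_xla.core.xla_model as xm
    from torch_xla.distributed.zero_redundancy_optimizer import ZeroRedundancyOptimizer

    device = xm.xla_device()
    model = torch.nn.Linear(128, 128).to(device)
    optimizer = ZeroRedundancyOptimizer(
        model.parameters(),
        torch.optim.AdamW,     # wrapped optimizer class
        lr=1e-3,
    )

    # One training step so the optimizer has state worth checkpointing.
    out = model(torch.randn(8, 128).to(device))
    out.sum().backward()
    optimizer.step()
    xm.mark_step()

    xm.save(optimizer.state_dict(), "ckpt_optim.pt")          # save sharded state
    optimizer.load_state_dict(torch.load("ckpt_optim.pt"))    # reload (the fixed path)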

Release [2.0.4986.0]#

Date: 10/25/2024

Summary#

This patch release removes excessive lock wait time during neuron_parallel_compile graph extraction for large-cluster training.

Release [2.0.4115.0]#

Date: 09/16/2024

Summary#

This release of libneuronxla officially adds beta support for running JAX on AWS Trainium and Inferentia accelerators.

What’s new in this release#

Announcing beta Neuron support for JAX; see the sketch after the list below.

  • Trainium and Inferentia as PJRT pluggable devices

  • JAX 0.4.31 support (through PJRT C-API version 0.54)
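
As a quick way to exercise the new support, the sketch below assumes libneuronxla and JAX 0.4.31 are installed on a Trn or Inf instance and that the Neuron PJRT plugin is discovered automatically by JAX; the device names printed by jax.devices() may vary by release.

    # Minimal JAX check on a Trainium/Inferentia instance with libneuronxla
    # installed. The plugin is assumed to register itself with JAX's PJRT
    # plugin discovery; device naming may differ across releases.
    import jax
    import jax.numpy as jnp

    print(jax.devices())          # expect Neuron devices rather than CPU

    @jax.jit
    def affine(x, w, b):
        return x @ w + b

    x = jnp.ones((4, 8))
    w = jnp.ones((8, 2))
    b = jnp.zeros((2,))
    print(affine(x, w, b))        # compiled and executed via the Neuron PJRT plugin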
