# NxD Training

NxD Training is a PyTorch library for end-to-end distributed training.

## NxD Training Overview & Setup
- NxD Training Overview
- Setup

## API Reference Guide
- YAML Configuration Settings

## Developer Guide
- Integrating a New Model
- Integrating a new dataset/dataloader
- Registering an optimizer and LR scheduler
- Migrating from Neuron-NeMo-Megatron to Neuronx Distributed Training
- Migrating from NeMo to Neuronx Distributed Training

## Tutorials
- Megatron GPT Pretraining
- HuggingFace Llama3-8B Pretraining
- HuggingFace Llama3-8B Supervised Fine-tuning
- Checkpoint Conversion

## App Notes
- Introducing NxD Training
- Tensor Parallelism Overview
- Pipeline Parallelism Overview
- Activation Memory Reduction

## Misc
- Known Issues and Workarounds