This document is relevant for: Trn2, Trn3
nki.collectives
The nki.collectives module provides APIs for collective communication
operations, such as all-reduce and all-gather, across NeuronCores.
NKI Collectives
Collective operations for multi-rank communication.
- Perform an all-reduce on the given replica group and input/output tensors.
- Perform an all-gather on the given replica group and input/output tensors.
- Perform a reduce-scatter on the given replica group and input/output tensors.
- Perform an all-to-all on the given replica group and input/output tensors.
- Perform a variable-length all-to-all on the given replica group and input/output tensors.
- Send and receive data between ranks based on explicitly defined source-target pairs.
- Send and receive data between ranks in a ring, where sources and destinations are determined implicitly by the ring structure at runtime.
- Perform a collective permute with reduction in a ring, where sources and destinations are determined implicitly by the ring structure at runtime.
- Return the rank ID of the data to be processed in the current ring iteration.
- Return the rank ID of the current rank.
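The exact NKI function names are not reproduced here, but the semantics of the core collectives above can be illustrated with a minimal, framework-free sketch. The functions below are hypothetical stand-ins that simulate, in a single process, what each operation computes across ranks: `inputs[r]` plays the role of rank r's input tensor, and the returned list holds each rank's output.

```python
def all_reduce(inputs):
    """Every rank receives the elementwise sum of all ranks' tensors."""
    total = [sum(vals) for vals in zip(*inputs)]
    return [list(total) for _ in inputs]

def all_gather(inputs):
    """Every rank receives the concatenation of all ranks' tensors."""
    gathered = [x for rank in inputs for x in rank]
    return [list(gathered) for _ in inputs]

def reduce_scatter(inputs):
    """The elementwise sum is split evenly; rank i keeps shard i."""
    n = len(inputs)
    total = [sum(vals) for vals in zip(*inputs)]
    shard = len(total) // n
    return [total[i * shard:(i + 1) * shard] for i in range(n)]

def all_to_all(inputs):
    """Rank i's j-th shard goes to rank j (a distributed transpose)."""
    n = len(inputs)
    shard = len(inputs[0]) // n
    return [[x for src in range(n)
             for x in inputs[src][dst * shard:(dst + 1) * shard]]
            for dst in range(n)]
```

For example, with two ranks holding `[1, 2]` and `[3, 4]`, all-reduce gives every rank `[4, 6]`, all-gather gives every rank `[1, 2, 3, 4]`, reduce-scatter gives rank 0 `[4]` and rank 1 `[6]`, and all-to-all gives rank 0 `[1, 3]` and rank 1 `[2, 4]`.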
Constants

- Defines a group of ranks that participate in a collective operation.
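To show what a replica group constrains, here is a hypothetical sketch (not the NKI API): an all-reduce applied independently within each group, so ranks only exchange data with other ranks in the same group.

```python
def all_reduce_grouped(inputs, replica_groups):
    """Simulate an all-reduce scoped to replica groups.

    inputs[r] is rank r's tensor; replica_groups partitions the rank IDs.
    Each group computes its own elementwise sum, invisible to other groups.
    """
    outputs = [None] * len(inputs)
    for group in replica_groups:
        total = [sum(vals) for vals in zip(*(inputs[r] for r in group))]
        for r in group:
            outputs[r] = list(total)
    return outputs
```

For instance, with four ranks holding `[1]`, `[2]`, `[3]`, `[4]` and replica groups `[[0, 1], [2, 3]]`, ranks 0 and 1 each receive `[3]` while ranks 2 and 3 each receive `[7]`.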