This CUDA extension implements an optimized cross-entropy loss, adapted from Apex's Xentropy. We extend it to support bfloat16 and add an in-place backward pass to save memory.
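For reference, the loss this kernel fuses is standard cross-entropy computed via a numerically stable log-softmax. A minimal NumPy sketch of the math (not the CUDA implementation) might look like:

```python
import numpy as np

def cross_entropy(logits, labels):
    # Numerically stable log-softmax: subtract the row max before exponentiating.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # Negative log-likelihood of the true class, averaged over the batch.
    return -log_probs[np.arange(len(labels)), labels].mean()

logits = np.array([[2.0, 1.0, 0.1], [0.5, 2.5, 0.3]])
labels = np.array([0, 1])
loss = cross_entropy(logits, labels)  # ≈ 0.3185
```

The fused kernel computes the same quantity but avoids materializing the full log-probability matrix, and writes gradients into the logits buffer during the backward pass.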
It has only been tested on A100s.
```sh
cd csrc/xentropy && pip install .
```
As of 2023-09-15, this extension is no longer used in the FlashAttention repo; we have switched to a Triton-based implementation instead. See the CrossEntropyLoss module for more details.