Browse Source

Bump to v2.5.6

Tri Dao 1 year ago
parent
commit
6c9e60de56
2 changed files with 3 additions and 3 deletions
  1. flash_attn/__init__.py (+1 −1)
  2. training/Dockerfile (+2 −2)

flash_attn/__init__.py (+1 −1)

@@ -1,4 +1,4 @@
-__version__ = "2.5.5"
+__version__ = "2.5.6"
 
 from flash_attn.flash_attn_interface import (
     flash_attn_func,

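Since the package version is defined directly in flash_attn/__init__.py, a quick way to confirm that an environment actually picked up the bump is to read the string back at runtime. A minimal sketch, not part of this commit; the expected value is simply the new version shown above:

    import flash_attn

    # __version__ is assigned in flash_attn/__init__.py, so an installed
    # 2.5.6 build should report the bumped string.
    assert flash_attn.__version__ == "2.5.6", flash_attn.__version__
    print("flash-attn", flash_attn.__version__)
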
training/Dockerfile (+2 −2)

@@ -85,7 +85,7 @@ RUN pip install transformers==4.25.1 datasets==2.8.0 pytorch-lightning==1.8.6 tr
 RUN pip install git+https://github.com/mlcommons/logging.git@2.1.0
 
 # Install FlashAttention
-RUN pip install flash-attn==2.5.5
+RUN pip install flash-attn==2.5.6
 
 # Install CUDA extensions for fused dense
-RUN pip install git+https://github.com/HazyResearch/flash-attention@v2.5.5#subdirectory=csrc/fused_dense_lib
+RUN pip install git+https://github.com/HazyResearch/flash-attention@v2.5.6#subdirectory=csrc/fused_dense_lib
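After rebuilding the image with the bumped pins, a short smoke test can confirm that the new wheel imports and runs. A minimal sketch, assuming a CUDA device and the usual (batch, seqlen, nheads, headdim) tensor layout for flash_attn_func; the sizes below are arbitrary and only for illustration:

    import torch
    from flash_attn import flash_attn_func

    # FlashAttention expects fp16 or bf16 CUDA tensors of shape
    # (batch, seqlen, nheads, headdim).
    q = torch.randn(2, 128, 8, 64, device="cuda", dtype=torch.float16)
    k = torch.randn(2, 128, 8, 64, device="cuda", dtype=torch.float16)
    v = torch.randn(2, 128, 8, 64, device="cuda", dtype=torch.float16)

    out = flash_attn_func(q, k, v, causal=True)
    print(out.shape)  # torch.Size([2, 128, 8, 64])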