Tri Dao 898dd4bbf2 Pass seqused_k to _flash_attn_varlen_forward 6 months ago
layers 3566596ad8 Fix typo in RotaryEmbedding forward output type (#666) 1 year ago
losses ec6d22143b [CrossEntropy] Change ignored_index -> ignore_index 9 months ago
models 0d810cfb73 Fix KeyError handling for non-existing key in state_dict.pop() (#898) 7 months ago
modules 6bbc532388 fix: cast the alibi slopes to torch.float32 (#846) 10 months ago
ops 22339db185 remove an unused import (#960) 8 months ago
utils 320fb59487 Update citation 8 months ago
__init__.py 7551202cb2 Bump to v2.6.1 6 months ago
bert_padding.py c94cd09744 Updated missing docstrings for args and returns in bert_padding.py (#795) 1 year ago
flash_attn_interface.py 898dd4bbf2 Pass seqused_k to _flash_attn_varlen_forward 6 months ago
flash_attn_triton.py f1a73d0740 Run isort and black on python files 1 year ago
flash_attn_triton_og.py f1a73d0740 Run isort and black on python files 1 year ago
flash_blocksparse_attention.py f1a73d0740 Run isort and black on python files 1 year ago
flash_blocksparse_attn_interface.py f1a73d0740 Run isort and black on python files 1 year ago
fused_softmax.py f1a73d0740 Run isort and black on python files 1 year ago
pyproject.toml 73bd3f3bbb Move pyproject.toml to flash-attn and tests dir to avoid PEP 517 1 year ago
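
For orientation, flash_attn_interface.py holds the public attention entry points that the package re-exports (for example flash_attn_func). A minimal usage sketch, assuming flash-attn is installed with CUDA support and the GPU supports fp16; the (batch, seqlen, nheads, headdim) tensor layout and the causal flag follow the library's documented API, while the sizes below are arbitrary illustration values:

    # Minimal sketch: causal FlashAttention over random fp16 tensors on the GPU.
    import torch
    from flash_attn import flash_attn_func

    batch, seqlen, nheads, headdim = 2, 1024, 8, 64
    q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
    k = torch.randn_like(q)
    v = torch.randn_like(q)

    # Output keeps the same (batch, seqlen, nheads, headdim) layout as q.
    out = flash_attn_func(q, k, v, causal=True)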