Name                                | Commit     | Last commit message                                                       | Last updated
------------------------------------|------------|---------------------------------------------------------------------------|--------------
layers                              | 3566596ad8 | Fix typo in RotaryEmbedding forward output type (#666)                    | 1 year ago
losses                              | ec6d22143b | [CrossEntropy] Change ignored_index -> ignore_index                       | 7 months ago
models                              | 0d810cfb73 | Fix KeyError handling for non-existing key in state_dict.pop() (#898)     | 5 months ago
modules                             | 6bbc532388 | fix: cast the alibi slopes to torch.float32 (#846)                        | 9 months ago
ops                                 | 22339db185 | remove an unused import (#960)                                            | 6 months ago
utils                               | 320fb59487 | Update citation                                                           | 6 months ago
__init__.py                         | 7551202cb2 | Bump to v2.6.1                                                            | 5 months ago
bert_padding.py                     | c94cd09744 | Updated missing docstrings for args and returns in bert_padding.py (#795) | 10 months ago
flash_attn_interface.py             | 898dd4bbf2 | Pass seqused_k to _flash_attn_varlen_forward                              | 5 months ago
flash_attn_triton.py                | f1a73d0740 | Run isort and black on python files                                       | 1 year ago
flash_attn_triton_og.py             | f1a73d0740 | Run isort and black on python files                                       | 1 year ago
flash_blocksparse_attention.py      | f1a73d0740 | Run isort and black on python files                                       | 1 year ago
flash_blocksparse_attn_interface.py | f1a73d0740 | Run isort and black on python files                                       | 1 year ago
fused_softmax.py                    | f1a73d0740 | Run isort and black on python files                                       | 1 year ago
pyproject.toml                      | 73bd3f3bbb | Move pyproject.toml to flash-attn and tests dir to avoid PEP 517          | 1 year ago