david / flash-attention
mirror of https://github.com/Dao-AILab/flash-attention
Branch: changes_for_fp8
flash-attention / flash_attn / modules
Latest commit: Markus Krimmel · 6bbc532388 · fix: cast the alibi slopes to torch.float32 (#846) · 9 months ago
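The commit above addresses a dtype pitfall: the FlashAttention kernel reads ALiBi slopes as fp32, so slopes built in the model's compute dtype (fp16/bf16) must be cast before the call. Below is a minimal sketch of the idea, assuming the `flash_attn_func` signature documented in the upstream README; `alibi_slopes_fp32` is an illustrative helper following the slope schedule from the ALiBi paper, not necessarily the exact code in `mha.py`:

```python
import math

import torch
from flash_attn import flash_attn_func  # public API per the upstream README


def alibi_slopes_fp32(nheads: int) -> torch.Tensor:
    """ALiBi slope schedule from the paper: a geometric sequence whose start
    and ratio are both 2^(-8/n) for n heads (interleaved for non-powers of 2)."""
    def power_of_2(n: int) -> list[float]:
        ratio = 2.0 ** (-8.0 / n)
        return [ratio ** (i + 1) for i in range(n)]

    if math.log2(nheads).is_integer():
        slopes = power_of_2(nheads)
    else:
        n = 2 ** math.floor(math.log2(nheads))
        slopes = power_of_2(n) + power_of_2(2 * n)[0::2][: nheads - n]
    # The kernel expects fp32 slopes regardless of the activation dtype;
    # casting here is the kind of guard commit 6bbc532388 (#846) adds.
    return torch.tensor(slopes, dtype=torch.float32)


# Usage: q/k/v are (batch, seqlen, nheads, headdim) in fp16; slopes stay fp32.
q = torch.randn(1, 2048, 8, 64, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)
out = flash_attn_func(q, k, v, causal=True,
                      alibi_slopes=alibi_slopes_fp32(8).to("cuda"))
```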
__init__.py    ece539abd6   Add __init__.py files to subdirectories for installation   2 years ago
block.py       abbc131173   [LayerNorm] Switch from CUDA to Triton implementation      1 year ago
embedding.py   f1a73d0740   Run isort and black on python files                        1 year ago
mha.py         6bbc532388   fix: cast the alibi slopes to torch.float32 (#846)         9 months ago
mlp.py         c3b2196652   Add Alibi to MHA, test with Baichuan-13B                    1 year ago
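The latest change to block.py swaps a custom CUDA layer-norm extension for a Triton one: Triton lets the normalization be written as a row-per-program kernel in Python and JIT-compiled, removing the separate CUDA build step. A forward-only sketch of that approach follows (the repo's real kernel also fuses residual/dropout paths and implements a backward pass; all names here are illustrative):

```python
import torch
import triton
import triton.language as tl


@triton.jit
def _layer_norm_fwd(X, Y, W, B, N, eps, BLOCK_N: tl.constexpr):
    row = tl.program_id(0)                     # one program normalizes one row
    cols = tl.arange(0, BLOCK_N)
    mask = cols < N
    x = tl.load(X + row * N + cols, mask=mask, other=0.0).to(tl.float32)
    mean = tl.sum(x, axis=0) / N
    xbar = tl.where(mask, x - mean, 0.0)
    var = tl.sum(xbar * xbar, axis=0) / N
    rstd = 1.0 / tl.sqrt(var + eps)
    w = tl.load(W + cols, mask=mask, other=0.0).to(tl.float32)
    b = tl.load(B + cols, mask=mask, other=0.0).to(tl.float32)
    y = xbar * rstd * w + b                    # compute in fp32, store in X's dtype
    tl.store(Y + row * N + cols, y, mask=mask)


def layer_norm(x: torch.Tensor, weight: torch.Tensor, bias: torch.Tensor,
               eps: float = 1e-5) -> torch.Tensor:
    x = x.contiguous()
    M, N = x.shape
    y = torch.empty_like(x)
    _layer_norm_fwd[(M,)](x, y, weight, bias, N, eps,
                          BLOCK_N=triton.next_power_of_2(N))
    return y
```

This sketch assumes the whole row fits in one block; hidden sizes beyond that need a looped reduction, which is one reason the production kernel is considerably longer.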