david / flash-attention
Mirror of https://github.com/Dao-AILab/flash-attention
Branch: tdd
Commit History

Author     SHA1          Message                                                        Date
Tri Dao    a157cc8c9b    [FT] Implement MQA/GQA                                         1 year ago
Tri Dao    62e9814466    [Rotary] Make sure frequency calculation is in fp32            1 year ago
Tri Dao    48bc6eacd6    [Gen] Add rotary base as an argument to FT attention kernel    1 year ago
Tri Dao    a01d1213d7    [Gen] Add kernel from FasterTransformer for benchmarking       2 years ago
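The top commit implements MQA/GQA (multi-query / grouped-query attention), in which several query heads share a single key/value head, shrinking the KV cache during decoding. A minimal NumPy sketch of the head-sharing idea follows; the function name and shapes are illustrative assumptions, not the repository's actual kernel API:

```python
import numpy as np

def grouped_query_attention(q, k, v):
    """Naive grouped-query attention (illustrative, not flash-attention's API).

    q:    (n_q_heads, seq, d)
    k, v: (n_kv_heads, seq, d), with n_q_heads % n_kv_heads == 0.
    Each group of n_q_heads // n_kv_heads query heads reuses one KV head.
    """
    n_q_heads, seq, d = q.shape
    n_kv_heads = k.shape[0]
    group = n_q_heads // n_kv_heads  # query heads per KV head
    out = np.empty_like(q)
    for h in range(n_q_heads):
        kv = h // group  # map query head -> shared KV head
        scores = q[h] @ k[kv].T / np.sqrt(d)
        scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
        p = np.exp(scores)
        p /= p.sum(axis=-1, keepdims=True)
        out[h] = p @ v[kv]
    return out
```

With n_kv_heads == n_q_heads this reduces to standard multi-head attention; with n_kv_heads == 1 it is MQA.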
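The two rotary commits keep the frequency calculation in fp32 and expose the rotary base as a parameter. Computing base ** (-2i / dim) in half precision rounds badly, which distorts angles at large positions. A hedged sketch of that pattern, assuming illustrative function names not taken from the repository:

```python
import numpy as np

def rotary_inv_freq(dim, base=10000.0):
    # Frequencies are computed in float32 even when activations are fp16:
    # base ** (-i / dim) loses precision if evaluated in half precision.
    # `base` is exposed as an argument, mirroring the commit that makes
    # the rotary base configurable.
    i = np.arange(0, dim, 2, dtype=np.float32)
    return (base ** (-i / dim)).astype(np.float32)

def rotary_angles(positions, inv_freq):
    # Outer product position x frequency, still in fp32, before any
    # cast back to the activation dtype.
    pos = np.asarray(positions, dtype=np.float32)
    return np.outer(pos, inv_freq)
```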