.. list-table::
   :header-rows: 1

   * - Name
     - Latest commit
     - Commit message
     - Last updated
   * - ``backends``
     - ``71a26f0998``
     - chore: use pytorch sdpa backend to do naive attention for rocm
     - 7 months ago
   * - ``ops``
     - ``ced1b36b8b``
     - feat: support head size of 192
     - 7 months ago
   * - ``__init__.py``
     - ``a94de94c44``
     - refactor: combine the prefill and decode into a single API (#553)
     - 7 months ago
   * - ``layer.py``
     - ``ac79d115b3``
     - add guards for prefix caching, fp8, chunked, etc
     - 7 months ago
   * - ``selector.py``
     - ``696f2cd59c``
     - add phi3_small support with blocksparse attention
     - 7 months ago