AlpinDale    1390915778  multi-step: add support for flashinfer attention backend (#1033)           4 weeks ago
backends     1390915778  multi-step: add support for flashinfer attention backend (#1033)           4 weeks ago
ops          e200775863  feat: enable using fp8 kv and prefix caching with chunked prefill (#668)   4 months ago
__init__.py  1405051912  attention: add `AttentionState` abstraction (#863)                         1 month ago
layer.py     bf88c8567e  feat: mamba model support (#674)                                           4 months ago
selector.py  4ddc14d653  core: use flashinfer for FP8 KV when available (#944)                      1 month ago