AlpinDale 93cffaf446 add flash_attn back 7 months ago
attention 93cffaf446 add flash_attn back 7 months ago
common 9e73559eba make use of batched rotary embedding kernels to support long context lora 7 months ago
distributed c58589318f remove the graph mode func 7 months ago
endpoints fe431bb840 check for next port if current is unavailable 7 months ago
engine 9e73559eba make use of batched rotary embedding kernels to support long context lora 7 months ago
executor eaa06fdd14 fix some f-strings 7 months ago
kv_quant e42a78381a feat: switch from pylint to ruff (#322) 1 year ago
lora 9e73559eba make use of batched rotary embedding kernels to support long context lora 7 months ago
modeling f970f3f3fb add base class for VLMs 7 months ago
processing 9e73559eba make use of batched rotary embedding kernels to support long context lora 7 months ago
quantization 8e11259e90 missing triton autoconfig for rocm flash attn 7 months ago
spec_decode 236be273e5 feat: tensor parallel speculative decoding (#554) 7 months ago
task_handler 9e73559eba make use of batched rotary embedding kernels to support long context lora 7 months ago
transformers_utils 9e73559eba make use of batched rotary embedding kernels to support long context lora 7 months ago
__init__.py be8154a8a0 feat: proper embeddings API with e5-mistral-7b support 7 months ago
py.typed 1c988a48b2 fix logging and add py.typed 1 year ago