AlpinDale 017b42c517 chore: use fork as the default method for mp backend 7 months ago
attention 6a57861fca feat: initial XPU support via intel_extension_for_pytorch (#571) 7 months ago
common 0613d91551 fix: kv head calculation with MPT GQA 7 months ago
distributed 017b42c517 chore: use fork as the default method for mp backend 7 months ago
endpoints c05a45f22f chore: minor updates to throughput benchmark and llm class 7 months ago
engine 3c7444c89b fix: asyncio.run hangs in python < 3.12 7 months ago
executor 017b42c517 chore: use fork as the default method for mp backend 7 months ago
kv_quant e42a78381a feat: switch from pylint to ruff (#322) 1 year ago
lora 42d2ee0f43 chore: better error logging for unsupported lora weights 7 months ago
modeling 025322ee5f fix: fp8 kv cache for qwen2 models 7 months ago
multimodal f2e94e2184 chore: minor llava cleanups in preparation for llava-next 7 months ago
processing f9a10145d1 fix: v2 block manager + prefix caching 7 months ago
quantization cd9ed8623b fix: cuda version check for fp8 support in the cutlass kernels 7 months ago
spec_decode 313e6e1ec7 feat: add typical acceptance sampling 7 months ago
task_handler 6a57861fca feat: initial XPU support via intel_extension_for_pytorch (#571) 7 months ago
transformers_utils bba89fc6d3 chore: make the automatic rope scaling behave properly with rope_scaling arg, add rope theta 7 months ago
__init__.py a07fc83bc8 chore: proper util for aphrodite version 7 months ago
_custom_ops.py cd9ed8623b fix: cuda version check for fp8 support in the cutlass kernels 7 months ago
_ipex_ops.py 6a57861fca feat: initial XPU support via intel_extension_for_pytorch (#571) 7 months ago
py.typed 1c988a48b2 fix logging and add py.typed 1 year ago
version.py 7e54c3916d chore: factor out epilogues from cutlass kernels 7 months ago