AlpinDale 9be43994fe feat: fbgemm quantization support (#601) 5 months ago
configs fca911ee0a vLLM Upstream Sync (#526) 7 months ago
__init__.py cf472315cc refactor: isolate FP8 from mixtral 6 months ago
fused_moe.py 1efd0f89b7 feat: support FP8 for DeepSeekV2 MoE 5 months ago
layer.py 9be43994fe feat: fbgemm quantization support (#601) 5 months ago
moe_pallas.py e1475fbec7 feat: MoE support with Pallas GMM kernel for TPUs 5 months ago