Commit History

Author     SHA1        Message                                                              Date
AlpinDale  1efd0f89b7  feat: support FP8 for DeepSeekV2 MoE                                 6 months ago
AlpinDale  cdc0e498a9  fix: illegal memory access in FP8 MoE kernel                         6 months ago
AlpinDale  3e7d5f7d14  chore: reload fused_moe config on the last chunk                     6 months ago
AlpinDale  3b2666314d  fix: add chunking mechanism to fused_moe                             6 months ago
AlpinDale  336eb4dbf8  fix: raise error in MoE kernel if it receives more than 65k tokens   6 months ago
AlpinDale  bbde979ecd  DeepSeek-V2 (#579)                                                   6 months ago
AlpinDale  156f577f79  feat: switch from `PYBIND11_MODULE` to `TORCH_LIBRARY` (#569)        6 months ago
AlpinDale  4bdd2f9892  chore: enhance MoE benchmarking                                      6 months ago
AlpinDale  00acf371f9  rocm: fused topk softmax                                             6 months ago
AlpinDale  1e35cef979  feat: add arctic snowflake model (#551)                              6 months ago
AlpinDale  0751a2ecf6  fix expert_ids shape in MoE                                          7 months ago
AlpinDale  db9beeb79c  fix typo                                                             7 months ago
AlpinDale  b565928d3f  fix: compute_dtype in MoE kernel                                     7 months ago
AlpinDale  36660b55c2  chore: mixtral fp8 w/ static scales (#542)                           7 months ago
AlpinDale  fca911ee0a  vLLM Upstream Sync (#526)                                            7 months ago
AlpinDale  9d81716bfd  [v0.5.3] Release Candidate (#388)                                    9 months ago
AlpinDale  f8dfac6372  chore: attention refactor and upstream sync apr01 (#365)             10 months ago