Commit History

Author SHA1 Message Date
  AlpinDale 1390915778 multi-step: add support for flashinfer attention backend (#1033) 4 weeks ago
  AlpinDale 6f59024522 torch.compile: hide slicing under custom op for inductor (#1029) 4 weeks ago
  AlpinDale de341ffb00 fix: ensure multistep lookahead allocation is compatible with cugraph max capture (#1008) 4 weeks ago
  AlpinDale 5c3b94de45 spec decode: move ops.advance_step to flash attention backend (#1005) 4 weeks ago
  AlpinDale 3bb0f07461 chore: rename `task_handler` to `worker` (#985) 1 month ago
  AlpinDale 1405051912 attention: add `AttentionState` abstraction (#863) 1 month ago
  AlpinDale 7a313483f1 chore: move update_flash_attn_metadata to attn backend (#731) 4 months ago
  AlpinDale 60b702a827 chore: register custom torch ops for flash-attn and flashinfer (#724) 4 months ago
  AlpinDale 24456206a9 fix: logit softcapping in flash-attn (#688) 4 months ago
  AlpinDale 7df7b8ca53 optimization: reduce end-to-end overhead from python obj allocation (#666) 4 months ago
  AlpinDale f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
  AlpinDale 9d81716bfd [v0.5.3] Release Candidate (#388) 8 months ago