Commit history

Author     SHA1        Message                                                                                      Date
AlpinDale  1390915778  multi-step: add support for flashinfer attention backend (#1033)                            4 weeks ago
AlpinDale  6f59024522  torch.compile: hide slicing under custom op for inductor (#1029)                            4 weeks ago
AlpinDale  de341ffb00  fix: ensure multistep lookahead allocation is compatible with cugraph max capture (#1008)   4 weeks ago
AlpinDale  5c3b94de45  spec decode: move ops.advane_step to flash attention backend (#1005)                        4 weeks ago
AlpinDale  3bb0f07461  chore: rename `task_handler` to `worker` (#985)                                              1 month ago
AlpinDale  1405051912  attention: add `AttentionState` abstraction (#863)                                           1 month ago
AlpinDale  7a313483f1  chore: move update_flash_attn_metadata to attn backend (#731)                                4 months ago
AlpinDale  60b702a827  chore: register custom torch ops for flash-attn and flashinfer (#724)                       4 months ago
AlpinDale  24456206a9  fix: logit softcapping in flash-attn (#688)                                                  4 months ago
AlpinDale  7df7b8ca53  optimization: reduce end-to-end overhead from python obj allocation (#666)                  4 months ago
AlpinDale  f1d0b77c92  [0.6.0] Release Candidate (#481)                                                             4 months ago
AlpinDale  9d81716bfd  [v0.5.3] Release Candidate (#388)                                                            8 months ago