Commit History

Author     SHA1        Message                                                                            Date
AlpinDale  6c1eab6a6c  feat: non-blocking transfer in prepare_input                                       4 months ago
AlpinDale  212b9d8a03  refactor: add has_prefix_cache_hit flag to FlashAttentionMetadataBuilder           4 months ago
AlpinDale  614ca6b0bf  feat: support logits soft capping with flash attention backend                     4 months ago
AlpinDale  b15e6376f8  bump to torch 2.4.0, add aphrodite_flash_attn (#614)                               4 months ago
AlpinDale  705e50f4bd  fix: broadcasting logic for multi_modal_kwargs                                     4 months ago
AlpinDale  98ad5a6cba  fix: decode tokens w/ CUDA graphs and graps with flashinfer                        4 months ago
AlpinDale  32bdbd1ee4  chore: add fp8 support to `reshape_and_cache_flash`                                4 months ago
AlpinDale  84a9cd25c9  fix: some naming issues                                                            4 months ago
AlpinDale  51ea8ad376  chore: modularize prepare input and attn metadata builder                          4 months ago
AlpinDale  22305c91e9  refactor _prepare_model_input_tensor and attn metadata builder for most backends   4 months ago
AlpinDale  9d7beaa5b9  chore: separate kv_scale into k_scale and v_scale                                  4 months ago
AlpinDale  2105e4fd6b  feat: correctly invoke prefill & decode kernels for cross-attention                4 months ago
AlpinDale  405bb74612  Control plane comms refactor (#573)                                                5 months ago
AlpinDale  ab7f4ed6e5  chore: revert commit for removing unnecessary copies in flash attn backend         5 months ago
AlpinDale  156f577f79  feat: switch from `PYBIND11_MODULE` to `TORCH_LIBRARY` (#569)                      5 months ago
AlpinDale  75f97bc25d  bump flash-attn to remove unnecessary copies in the backend                        5 months ago
AlpinDale  696f2cd59c  add phi3_small support with blocksparse attention                                  5 months ago
AlpinDale  b8b63eb5ca  fix head_size check for flash attention backend                                    5 months ago
AlpinDale  93cffaf446  add flash_attn back                                                                5 months ago
AlpinDale  0c15965621  fix fp8 kv                                                                         5 months ago
AlpinDale  a94de94c44  refactor: combine the prefill and decode into a single API (#553)                  5 months ago
AlpinDale  01190e5049  use flash attention for the decoding phase                                         5 months ago
AlpinDale  50b7c13db0  refactor: attention selector (#552)                                                5 months ago
AlpinDale  d11d68f4e6  switch to vllm-flash-attn                                                          5 months ago
AlpinDale  8b56dc4347  dict -> torch.Tensor for blocks_to_swap                                            5 months ago
AlpinDale  3a0d1c7705  add get_name method to attention backends                                          5 months ago
AlpinDale  21ce19b3ea  blocks_to_copy dict -> torch.Tensor                                                5 months ago
AlpinDale  35ae01d7ba  refactor: attention metadata term                                                  5 months ago
AlpinDale  aa15d3dc0f  sliding window in prefix prefill kernel                                            5 months ago
AlpinDale  fca911ee0a  vLLM Upstream Sync (#526)                                                          6 months ago