Commit History

Author SHA1 Message Date
AlpinDale 9bbc75d2e3 wip 6 months ago
AlpinDale 60af35bc34 wip 6 months ago
AlpinDale 74cb1aad4e wip 6 months ago
AlpinDale 5d965b34a7 bitnet -> bitblas in reqs 6 months ago
AlpinDale 5884e0b904 add bitnetforcausallm support 6 months ago
AlpinDale 2649f3f14e aqlm works on pascal 6 months ago
AlpinDale ac79d115b3 add guards for prefix caching, fp8, chunked, etc 6 months ago
AlpinDale 344ddaac5a properly disable speculative decoding 6 months ago
AlpinDale 696f2cd59c add phi3_small support with blocksparse attention 6 months ago
AlpinDale 0d15aa3ab3 fix prefix caching for block manager v2 6 months ago
AlpinDale 7d0884de9a fix mistral v0.3 weight loading 6 months ago
AlpinDale e8b7f53321 allow prompt token IDs in the logits processor api 6 months ago
AlpinDale f4ea11b982 feat: initial support for activation quantization 6 months ago
Drake e1a142c179 Fix OpenAI chat completions compatibility (#559) 6 months ago
AlpinDale 5b0c11d190 support pipeline parallel pynccl groups 6 months ago
AlpinDale f6250c5516 move dockerfiles to root; fix cpu build 6 months ago
AlpinDale d8667fcb98 improve gptq_marlin_24 prefill performance 6 months ago
AlpinDale eb2c5c77df feat: enforce the max possible seqlen 6 months ago
AlpinDale 19a959a03e prioritize user selection for attention 6 months ago
AlpinDale c1ed789835 fix: typo in llama.py 6 months ago
AlpinDale 4e1ae004da make mp the default distributed backend 6 months ago
AlpinDale de62ceb18c refactor: eliminate parallel worker per-step task scheduling overhead 6 months ago
AlpinDale 656459fd84 make fp8_e4m3 work on nvidia 6 months ago
AlpinDale 6e626b902c fix cutlass w8a8 kernels for cuda stream 6 months ago
AlpinDale 3bdeb3e116 fix: clang formatting for all kernels (#558) 6 months ago
AlpinDale 04d22bf1a9 add clang-format 6 months ago
AlpinDale 60e74e92fd add rope_scaling arg 6 months ago
AlpinDale b8b63eb5ca fix head_size check for flash attention backend 6 months ago
AlpinDale 8077af0b2f add lora support for phi 6 months ago
AlpinDale 295cfb2f39 add rope scaling for qwen2 6 months ago