Revision history

Author SHA1 Message Date
AlpinDale 6a57861fca feat: initial XPU support via intel_extension_for_pytorch (#571) 7 months ago
AlpinDale a89c9a0e92 fix: device ordinal issues with world_size and stuff 7 months ago
AlpinDale fe21123a1c feat: TPU support (#570) 7 months ago
AlpinDale 156f577f79 feat: switch from `PYBIND11_MODULE` to `TORCH_LIBRARY` (#569) 7 months ago
AlpinDale b029a544ff optimize eager mode host time with numpy 7 months ago
AlpinDale f2b7a42c4e fix: async cancels in merge_async_iterators for python>=3.9 7 months ago
AlpinDale 7194047318 remove vllm-nccl 7 months ago
AlpinDale 90ceab32ff refactor: consolidate prompt args to LLM engines 7 months ago
AlpinDale 656459fd84 make fp8_e4m3 work on nvidia 8 months ago
AlpinDale 251568470e initial nvidia fp8 e4m3 for kv cache 8 months ago
AlpinDale 4476d2d885 remove cuda version check 8 months ago
AlpinDale 2351a0e2cd feat: FlashInfer backend for decoding phase (#548) 8 months ago
AlpinDale 2656df543b why was this removed? weird 8 months ago
AlpinDale 2e0b115ce1 move func tracing to utils 8 months ago
AlpinDale 46159b107a formatting: pt1 8 months ago
AlpinDale fca911ee0a vLLM Upstream Sync (#526) 8 months ago
AlpinDale f894f7b176 Revert "reduce dedupe by wrapping in general worker class" 10 months ago
AlpinDale 9fff6fb507 reduce dedupe by wrapping in general worker class 10 months ago
AlpinDale 9d81716bfd [v0.5.3] Release Candidate (#388) 10 months ago
AlpinDale e3252edd07 fix: remove event and stream, add typing (#382) 11 months ago
AlpinDale 33b3786175 fix: cache neuron checks (#379) 11 months ago
AlpinDale f8dfac6372 chore: attention refactor and upstream sync apr01 (#365) 11 months ago
AlpinDale e53842bd5d fix: cuda home detection for fp8 kv cache 1 year ago
AlpinDale e42a78381a feat: switch from pylint to ruff (#322) 1 year ago
AlpinDale c2d77b1822 chore: logging refactor (#302) 1 year ago
AlpinDale 9810daa699 feat: INT8 KV Cache (#298) 1 year ago
AlpinDale ea0f57b233 feat: allow further support for non-cuda devices (#247) 1 year ago
AlpinDale 31c95011a6 feat: FP8 E5M2 KV Cache (#226) 1 year ago
AlpinDale c0aac15421 feat: S-LoRA support (#222) 1 year ago
AlpinDale 8fa608aeb7 feat: replace Ray with NCCL for control plane comms (#221) 1 year ago