Commit History

Author      SHA1        Message                                                              Date
  AlpinDale 523ac99aca chore: pipeline parallel with Ray accelerated dag                     6 months ago
  AlpinDale 6124140a45 fix: remove error_on_invalid_device_count_status                      6 months ago
  AlpinDale 8fc6bc6f8e fix: define self.forward_dag before init_workers_ray                  7 months ago
  AlpinDale 45a004874c chore: allow specifying custom Executor                               7 months ago
  AlpinDale b7a2d52e47 fix: allow using mp executor for pipeline parallel                    7 months ago
  AlpinDale 052a6e1eb6 feat: add SPMD worker execution using Ray accelerated DAG             7 months ago
  AlpinDale 6f8beb8583 fix: 4-node crash with PP                                              7 months ago
  AlpinDale 23408b9b2b chore: skip the driver worker                                          7 months ago
  AlpinDale 1562e073c6 fix: ray worker rank assigment                                         7 months ago
  AlpinDale 4c3bb0b436 fix: pipeline parallel on python 3.8 and 3.9                           7 months ago
  AlpinDale 5257ebce8c fix: device >= 0 && device < num_gpus INTERNAL_ASSERT FAILED           7 months ago
  AlpinDale ae04f57ec1 feat: Pipeline Parallel support (#581)                                 7 months ago
  AlpinDale 405bb74612 Control plane comms refactor (#573)                                    7 months ago
  AlpinDale 323fe23b21 chore: use 127.0.0.1 for single-node setups                            7 months ago
  AlpinDale dfa59bc5f9 fix: 16 GPUs in a cluster                                              7 months ago
  AlpinDale 17eb1b7eb9 chore: remove ray health check                                         7 months ago
  AlpinDale de62ceb18c refactor: eliminate parallel worker per-step task scheduling overhead  8 months ago
  AlpinDale 9f3d6205ce fix ray gpu executor                                                   8 months ago
  AlpinDale 236be273e5 feat: tensor parallel speculative decoding (#554)                      8 months ago
  AlpinDale c6a501f682 add multiprocessing executor; make ray optional                        8 months ago
  AlpinDale ef733aee43 implement ExecuteModelData to reduce executor complexity               8 months ago
  AlpinDale 7bcf4c3fc9 centralize gpu worker construction                                     8 months ago
  AlpinDale fb982981ce num_lookahead_slots in neuron and ray executors                        8 months ago
  AlpinDale 957ed7d244 type hints                                                             8 months ago
  AlpinDale c21af7acad feat: `DistributedGPUExecutor` abstract class (#541)                  8 months ago
  AlpinDale 199e776722 chore: move ray utils to executor dir                                  9 months ago
  AlpinDale 46159b107a formatting: pt1                                                        9 months ago
  AlpinDale fca911ee0a vLLM Upstream Sync (#526)                                              9 months ago
  AlpinDale f894f7b176 Revert "reduce dedupe by wrapping in general worker class"             10 months ago
  AlpinDale 082b0b03bc Revert "actually run the workers"                                      10 months ago