AlpinDale 5b0c11d190 support pipeline parallel pynccl groups 8 months ago
__init__.py 2bd6c92f73 fix: lora inclusion in wheels 1 year ago
fully_sharded_layers.py e87c32bed3 feat: full tensor parallel for LoRA layers (#545) 8 months ago
layers.py 9e73559eba make use of batched rotary embedding kernels to support long context lora 8 months ago
lora.py e87c32bed3 feat: full tensor parallel for LoRA layers (#545) 8 months ago
models.py 9e73559eba make use of batched rotary embedding kernels to support long context lora 8 months ago
punica.py e87c32bed3 feat: full tensor parallel for LoRA layers (#545) 8 months ago
request.py 5b0c11d190 support pipeline parallel pynccl groups 8 months ago
utils.py 9e73559eba make use of batched rotary embedding kernels to support long context lora 8 months ago
worker_manager.py 9e73559eba make use of batched rotary embedding kernels to support long context lora 8 months ago