| File | Commit | Message | Last change |
| --- | --- | --- | --- |
| `__init__.py` | 2bd6c92f73 | fix: lora inclusion in wheels | 1 year ago |
| `fully_sharded_layers.py` | e87c32bed3 | feat: full tensor parallel for LoRA layers (#545) | 6 months ago |
| `layers.py` | 9e73559eba | make use of batched rotary embedding kernels to support long context lora | 6 months ago |
| `lora.py` | e87c32bed3 | feat: full tensor parallel for LoRA layers (#545) | 6 months ago |
| `models.py` | 9e73559eba | make use of batched rotary embedding kernels to support long context lora | 6 months ago |
| `punica.py` | e87c32bed3 | feat: full tensor parallel for LoRA layers (#545) | 6 months ago |
| `request.py` | 5b0c11d190 | support pipeline parallel pynccl groups | 6 months ago |
| `utils.py` | 9e73559eba | make use of batched rotary embedding kernels to support long context lora | 6 months ago |
| `worker_manager.py` | 9e73559eba | make use of batched rotary embedding kernels to support long context lora | 6 months ago |