| File | Commit | Commit message | Last modified |
|---|---|---|---|
| __init__.py | 2bd6c92f73 | fix: lora inclusion in wheels | 1 year ago |
| fully_sharded_layers.py | e87c32bed3 | feat: full tensor parallel for LoRA layers (#545) | 7 months ago |
| layers.py | c975bba905 | fix: sharded state loader with lora | 7 months ago |
| lora.py | e87c32bed3 | feat: full tensor parallel for LoRA layers (#545) | 7 months ago |
| models.py | 9e73559eba | make use of batched rotary embedding kernels to support long context lora | 7 months ago |
| punica.py | e87c32bed3 | feat: full tensor parallel for LoRA layers (#545) | 7 months ago |
| request.py | 5b0c11d190 | support pipeline parallel pynccl groups | 7 months ago |
| utils.py | c975bba905 | fix: sharded state loader with lora | 7 months ago |
| worker_manager.py | 5fecc6b025 | when was this deprecated? | 7 months ago |
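For context, `request.py` in this directory defines `LoRARequest`, the handle a caller attaches to a generation call to select a specific adapter, while files such as `layers.py`, `models.py`, and `worker_manager.py` implement the runtime side. The sketch below shows typical usage through vLLM's public API; the base model name and adapter path are illustrative placeholders, not values taken from this listing.

```python
# Minimal sketch: routing a generation request to a LoRA adapter via vLLM.
# The model name and adapter path below are placeholders.
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

# enable_lora=True activates the LoRA machinery (layers, worker manager) at load time.
llm = LLM(model="meta-llama/Llama-2-7b-hf", enable_lora=True)

sampling_params = SamplingParams(temperature=0.0, max_tokens=64)

# LoRARequest(name, int_id, path): int_id must be a unique positive integer
# per adapter; the path points to a locally available adapter checkpoint.
lora_request = LoRARequest("sql_adapter", 1, "/path/to/sql_lora_adapter")

outputs = llm.generate(
    ["Write a SQL query that lists all users."],
    sampling_params,
    lora_request=lora_request,
)
print(outputs[0].outputs[0].text)
```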