Commit History

Date           SHA1        Author     Message
5 months ago   f91991f584  AlpinDale  fix: f-string fixes
5 months ago   99680b2d23  AlpinDale  feat: soft prompts (#589)
5 months ago   0a6db357d8  AlpinDale  fix: use safetensor keys instead of adapter_config.json to find unexpected modules
5 months ago   85ef2fe8b1  AlpinDale  chore: clean up placeholder symbols
5 months ago   56e0b8223c  AlpinDale  chore: add base class for LoRA-supported models
5 months ago   25feb1d592  AlpinDale  chore: add support for pinning lora adapters in the lru cache
6 months ago   9e73559eba  AlpinDale  make use of batched rotary embedding kernels to support long context lora
6 months ago   eaa06fdd14  AlpinDale  fix some f-strings
6 months ago   b55381df0e  AlpinDale  speedup lora loading times by resuing the cpu dummy lora
6 months ago   e87c32bed3  AlpinDale  feat: full tensor parallel for LoRA layers (#545)
8 months ago   8be299e78b  AlpinDale  fix: lora load check
8 months ago   9d81716bfd  AlpinDale  [v0.5.3] Release Candidate (#388)
9 months ago   e3252edd07  AlpinDale  fix: remove event and stream, add typing (#382)
10 months ago  e42a78381a  AlpinDale  feat: switch from pylint to ruff (#322)
10 months ago  657aec0cbd  AlpinDale  refactor: OpenAI endpoint (#261)
11 months ago  697c06c4f5  AlpinDale  fix: LoRA support for mixtral (#276)
1 year ago     c0aac15421  AlpinDale  feat: S-LoRA support (#222)