Commit History

| Author | SHA | Message | Date |
|---|---|---|---|
| AlpinDale | 826de3ef93 | use flash attention with xformers | 1 year ago |
| AlpinDale | aba8b0b17a | add rope theta support and bump transformers | 1 year ago |
| AlpinDale | c687430ce7 | bump xformers and clean up leftover code | 1 year ago |
| AlpinDale | 32844c1522 | add GELU kernels and remove compile bloat | 1 year ago |
| AlpinDale | c318602c42 | update setuptools | 1 year ago |
| AlpinDale | 0aa5d13909 | clear cuda cache and state | 1 year ago |
| AlpinDale | 62dfd1a883 | fix top_k implementation | 1 year ago |
| AlpinDale | 1482542239 | fix: forgot an import | 1 year ago |
| AlpinDale | f3f31434c6 | Merge pull request #10 from PygmalionAI/feat/safetensor-support | 1 year ago |
| AlpinDale | 91abae0631 | fix: typo | 1 year ago |
| AlpinDale | 17b15d74c7 | fix: scheduler | 1 year ago |
| AlpinDale | ed540c3c87 | fix: typo in attention kernel | 1 year ago |
| AlpinDale | 8c2353e803 | llama support for safetensors | 1 year ago |
| AlpinDale | cbb90f0b2a | add safetensor support | 1 year ago |
| AlpinDale | fffb9f2dac | chore: attention kernel optimizations | 1 year ago |
| AlpinDale | acbf49ef89 | feat: refactor scheduler | 1 year ago |
| AlpinDale | 10334ebd7c | requirements: stricter version for fschat | 1 year ago |
| AlpinDale | bc725a82e8 | fix: leftover name changes from an old migration | 1 year ago |
| AlpinDale | 592ee204a6 | fix: ray depends on pyarrow | 1 year ago |
| AlpinDale | 4472e432cf | fix: requirements again. | 1 year ago |
| AlpinDale | 7991c14e51 | fix: requirements and accidental commit | 1 year ago |
| AlpinDale | 4ac5560152 | fix: revert template change for now | 1 year ago |
| AlpinDale | bf132e29d6 | feat: bump up version | 1 year ago |
| AlpinDale | 76b2e4a445 | Merge dev branch into main (#7) | 1 year ago |
| AlpinDale | 3cdff3cd8b | readme: add common issues | 1 year ago |
| AlpinDale | e8eac42213 | fix: incorrect call | 1 year ago |
| AlpinDale | b188d1093b | test: throughput | 1 year ago |
| AlpinDale | c10b83422d | readme: add notes | 1 year ago |
| AlpinDale | 908091008e | readme: typo | 1 year ago |
| AlpinDale | 229733d39f | feat: bump up the version | 1 year ago |