AlpinDale a788ca33bf hack in custom bias for attention kernels 9 months ago
all_reduce 6305e6f3f2 fix: no repeated IPC registration (#227) 11 months ago
attention a788ca33bf hack in custom bias for attention kernels 9 months ago
hadamard 5d288aa76c feat: add fast hadamard transformation kernels (#232) 11 months ago
moe 7d6ba53602 feat: fused top-k kernels for MoE (#273) 10 months ago
punica c0aac15421 feat: S-LoRA support (#222) 11 months ago
quantization 89c32b40ec chore: add new imatrix quants (#320) 10 months ago
activation_kernels.cu e31c6f0b45 feat: refactor modeling logic and support more models (#274) 10 months ago
cache.h 9810daa699 feat: INT8 KV Cache (#298) 10 months ago
cache_kernels.cu 9810daa699 feat: INT8 KV Cache (#298) 10 months ago
cuda_compat.h 8fa608aeb7 feat: replace Ray with NCCL for control plane comms (#221) 11 months ago
cuda_utils.h 31c95011a6 feat: FP8 E5M2 KV Cache (#226) 11 months ago
cuda_utils_kernels.cu 31c95011a6 feat: FP8 E5M2 KV Cache (#226) 11 months ago
dispatch_utils.h 9810daa699 feat: INT8 KV Cache (#298) 10 months ago
layernorm_kernels.cu 8fa608aeb7 feat: replace Ray with NCCL for control plane comms (#221) 11 months ago
ops.h a788ca33bf hack in custom bias for attention kernels 9 months ago
pos_encoding_kernels.cu 8fa608aeb7 feat: replace Ray with NCCL for control plane comms (#221) 11 months ago
pybind.cpp c41462cfcd feat: exllamav2 quantization (#305) 10 months ago
reduction.cuh 2755a48d51 merge dev branch into main (#153) 1 year ago
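The listing pairs CUDA kernel sources (activation_kernels.cu, cache_kernels.cu, layernorm_kernels.cu, ...) with launcher declarations in ops.h and Python bindings in pybind.cpp. As a minimal sketch of how such a directory is typically wired together, assuming the PyTorch C++/CUDA extension machinery (pybind11 via torch/extension.h), a kernel, its launcher, and its binding could look like the following; the op name scale_inplace and its signature are hypothetical examples, not this repository's actual API.

```cpp
// Minimal sketch, not the repository's code: an element-wise CUDA kernel,
// a launcher of the kind ops.h would declare, and a pybind.cpp-style binding.
#include <torch/extension.h>
#include <cuda_runtime.h>

__global__ void scale_kernel(float* __restrict__ data, float factor, int64_t n) {
  int64_t i = blockIdx.x * (int64_t)blockDim.x + threadIdx.x;
  if (i < n) data[i] *= factor;  // simple in-place element-wise scale
}

// Launcher: the kind of C++ entry point ops.h would declare.
void scale_inplace(torch::Tensor& t, double factor) {
  TORCH_CHECK(t.is_cuda() && t.dtype() == torch::kFloat32,
              "scale_inplace expects a float32 CUDA tensor");
  int64_t n = t.numel();
  int threads = 256;
  int blocks = static_cast<int>((n + threads - 1) / threads);
  scale_kernel<<<blocks, threads>>>(t.data_ptr<float>(),
                                    static_cast<float>(factor), n);
}

// Binding: the role pybind.cpp plays, exposing launchers to Python.
PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
  m.def("scale_inplace", &scale_inplace,
        "Hypothetical in-place scale kernel (illustrative only)");
}
```

Built as a torch CUDA extension, the resulting module would be importable from Python and the op callable on CUDA tensors; the real kernels above (attention, MoE top-k, INT8/FP8 KV cache, etc.) follow the same declare-launch-bind pattern at much larger scale.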