Latest commit: 696f2cd59c "add phi3_small support with blocksparse attention" (AlpinDale, 5 months ago)
| Name | Commit | Message | Last updated |
|------|--------|---------|--------------|
| all_reduce | 9d81716bfd | [v0.5.3] Release Candidate (#388) | 8 months ago |
| attention | 696f2cd59c | add phi3_small support with blocksparse attention | 5 months ago |
| backup | f8dfac6372 | chore: attention refactor and upstream sync apr01 (#365) | 9 months ago |
| cpu | 696f2cd59c | add phi3_small support with blocksparse attention | 5 months ago |
| hadamard | 5d288aa76c | feat: add fast hadamard transformation kernels (#232) | 11 months ago |
| moe | 9d81716bfd | [v0.5.3] Release Candidate (#388) | 8 months ago |
| punica | 3bdeb3e116 | fix: clang formatting for all kernels (#558) | 5 months ago |
| quantization | f4ea11b982 | feat: initial support for activation quantization | 5 months ago |
| activation_kernels.cu | 3d6695cfbb | feat: add approximate gelu activation kernels (#370) | 9 months ago |
| cache.h | 3bdeb3e116 | fix: clang formatting for all kernels (#558) | 5 months ago |
| cache_kernels.cu | 3bdeb3e116 | fix: clang formatting for all kernels (#558) | 5 months ago |
| cuda_compat.h | 3bdeb3e116 | fix: clang formatting for all kernels (#558) | 5 months ago |
| cuda_utils.h | 31c95011a6 | feat: FP8 E5M2 KV Cache (#226) | 11 months ago |
| cuda_utils_kernels.cu | 31c95011a6 | feat: FP8 E5M2 KV Cache (#226) | 11 months ago |
| dispatch_utils.h | f8dfac6372 | chore: attention refactor and upstream sync apr01 (#365) | 9 months ago |
| layernorm_kernels.cu | 9d81716bfd | [v0.5.3] Release Candidate (#388) | 8 months ago |
| ops.h | 696f2cd59c | add phi3_small support with blocksparse attention | 5 months ago |
| pos_encoding_kernels.cu | e702f587cf | feat: add batched RoPE kernels (#371) | 9 months ago |
| pybind.cpp | 3bdeb3e116 | fix: clang formatting for all kernels (#558) | 5 months ago |
| reduction.cuh | 9d81716bfd | [v0.5.3] Release Candidate (#388) | 8 months ago |