Entry                   Last commit   Message                                             Age
attention/              4e71bd1d12    feat: add PagedAttention V2 kernels (#76)           1 year ago
quantization/           ce5e2332ea    fix: launch AWQ kernels on the current CUDAStream (#75)  1 year ago
activation.cpp          32844c1522    add GELU kernels and remove compile bloat           1 year ago
activation_kernels.cu   3d72f05c7b    feat: flattened 1D tensor -> 2D tensor (#85)        1 year ago
attention.cpp           4e71bd1d12    feat: add PagedAttention V2 kernels (#76)           1 year ago
cache.cpp               081545bde6    fix: various CUDA kernel tweaks                     1 year ago
cache_kernels.cu        3d72f05c7b    feat: flattened 1D tensor -> 2D tensor (#85)        1 year ago
cuda_utils.cpp          75c27d3e65    massive overhaul                                    1 year ago
cuda_utils_kernels.cu   75c27d3e65    massive overhaul                                    1 year ago
dispatch_utils.h        32844c1522    add GELU kernels and remove compile bloat           1 year ago
layernorm.cpp           081545bde6    fix: various CUDA kernel tweaks                     1 year ago
layernorm_kernels.cu    3d72f05c7b    feat: flattened 1D tensor -> 2D tensor (#85)        1 year ago
pos_encoding.cpp        45f6d9f923    initial refactor commit                             1 year ago
pos_encoding_kernels.cu 3d72f05c7b    feat: flattened 1D tensor -> 2D tensor (#85)        1 year ago
quantization.cpp        0495c50a3e    GPTQ+exllama support (#21)                          1 year ago
reduction.cuh           081545bde6    fix: various CUDA kernel tweaks                     1 year ago