| Name | Commit | Last commit message | Last updated |
|---|---|---|---|
| all_reduce | 156f577f79 | feat: switch from `PYBIND11_MODULE` to `TORCH_LIBRARY` (#569) | 5 months ago |
| attention | dc1b59df9c | fix: compiler warnings for _C and _moe | 4 months ago |
| backup | f8dfac6372 | chore: attention refactor and upstream sync apr01 (#365) | 9 months ago |
| core | 141672a0d4 | kernels: disambiguate quantized types via a new ScalarType | 4 months ago |
| cpu | 141672a0d4 | kernels: disambiguate quantized types via a new ScalarType | 4 months ago |
| hadamard | 5d288aa76c | feat: add fast hadamard transformation kernels (#232) | 11 months ago |
| mamba | dc1b59df9c | fix: compiler warnings for _C and _moe | 4 months ago |
| moe | 141672a0d4 | kernels: disambiguate quantized types via a new ScalarType | 4 months ago |
| prepare_inputs | dd18c5042c | move prepare_inputs to the GPU (#596) | 4 months ago |
| punica | 848731f527 | chore: add punica sizes for mistral nemo | 4 months ago |
| quantization | 0e6c400b13 | feat: re-add GGUF (#600) | 4 months ago |
| activation_kernels.cu | 5fbd93797d | fix: beta value in gelu_tanh kernel being divided by 0.5 | 4 months ago |
| cache.h | 32bdbd1ee4 | chore: add fp8 support to `reshape_and_cache_flash` | 4 months ago |
| cache_kernels.cu | 32bdbd1ee4 | chore: add fp8 support to `reshape_and_cache_flash` | 4 months ago |
| cuda_compat.h | 00acf371f9 | rocm: fused topk softmax | 5 months ago |
| cuda_utils.h | 156f577f79 | feat: switch from `PYBIND11_MODULE` to `TORCH_LIBRARY` (#569) | 5 months ago |
| cuda_utils_kernels.cu | 156f577f79 | feat: switch from `PYBIND11_MODULE` to `TORCH_LIBRARY` (#569) | 5 months ago |
| dispatch_utils.h | dc1b59df9c | fix: compiler warnings for _C and _moe | 4 months ago |
| layernorm_kernels.cu | dc1b59df9c | fix: compiler warnings for _C and _moe | 4 months ago |
| ops.h | dd18c5042c | move prepare_inputs to the GPU (#596) | 4 months ago |
| pos_encoding_kernels.cu | 156f577f79 | feat: switch from `PYBIND11_MODULE` to `TORCH_LIBRARY` (#569) | 5 months ago |
| reduction.cuh | aba03b4756 | feat: dynamic per-token activation quantization | 5 months ago |
| torch_bindings.cpp | 0e6c400b13 | feat: re-add GGUF (#600) | 4 months ago |
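Several of the entries above (all_reduce, cuda_utils.h, cuda_utils_kernels.cu, pos_encoding_kernels.cu) point at commit 156f577f79, which moved the extension bindings from `PYBIND11_MODULE` to `TORCH_LIBRARY`, registered in torch_bindings.cpp. As a rough, hedged sketch only, a `TORCH_LIBRARY`-based registration for a custom op looks roughly like the following; the namespace `_C_sketch`, the op name `relu_squared`, and its implementation are hypothetical placeholders and are not taken from this repository's actual schema.

```cpp
// Minimal sketch of a TORCH_LIBRARY binding (the pattern PYBIND11_MODULE was replaced with).
// The extension namespace "_C_sketch" and the op "relu_squared" are illustrative only.
#include <torch/library.h>
#include <ATen/ATen.h>

// Hypothetical implementation used only for illustration.
at::Tensor relu_squared(const at::Tensor& x) {
  auto y = at::relu(x);
  return y * y;
}

// Declare the operator schema once under the chosen namespace.
TORCH_LIBRARY(_C_sketch, m) {
  m.def("relu_squared(Tensor x) -> Tensor");
}

// Register an implementation. A real CUDA kernel would typically use
// TORCH_LIBRARY_IMPL(_C_sketch, CUDA, m) and dispatch to a .cu entry point.
TORCH_LIBRARY_IMPL(_C_sketch, CompositeExplicitAutograd, m) {
  m.impl("relu_squared", &relu_squared);
}
```

On the Python side, ops registered this way are reached through the dispatcher as `torch.ops._C_sketch.relu_squared(x)` rather than as attributes of a pybind11 extension module, which is what lets the same schema serve CPU, CUDA, and ROCm implementations.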