AlpinDale 7ca63930c8 support deepseek_v3 model 1 week ago
all_reduce f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
attention 3bb0f07461 chore: rename `task_handler` to `worker` (#985) 2 weeks ago
backup 3bb0f07461 chore: rename `task_handler` to `worker` (#985) 2 weeks ago
core f1ea7711bd core: do not compile ScalarType for torch < 2.4.0 (#938) 2 weeks ago
cpu a113309876 kernel: add meta functions for ops to prevent graph breaks (#1019) 1 week ago
cutlass_extensions 93bc863591 feat: Machete Kernels for Hopper GPUs (#842) 1 month ago
hadamard 5d288aa76c feat: add fast hadamard transformation kernels (#232) 11 months ago
mamba 0256ed236b feat: windows support (#790) 2 months ago
moe 7ca63930c8 support deepseek_v3 model 1 week ago
prepare_inputs 1390915778 multi-step: add support for flashinfer attention backend (#1033) 1 week ago
punica f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
quantization a113309876 kernel: add meta functions for ops to prevent graph breaks (#1019) 1 week ago
sampling bfc8988116 feat: add cuda sampling kernels for top_k and top_p (#828) 1 month ago
activation_kernels.cu f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
cache.h f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
cache_kernels.cu 0256ed236b feat: windows support (#790) 2 months ago
cuda_compat.h f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
cuda_utils.h 93bc863591 feat: Machete Kernels for Hopper GPUs (#842) 1 month ago
cuda_utils_kernels.cu f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
dispatch_utils.h f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
layernorm_kernels.cu e14223dce5 kernel: use `cub::BlockReduce` instead of custom impl (#895) 3 weeks ago
ops.h 1390915778 multi-step: add support for flashinfer attention backend (#1033) 1 week ago
permute_cols.cu 93bc863591 feat: Machete Kernels for Hopper GPUs (#842) 1 month ago
pos_encoding_kernels.cu f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
torch_bindings.cpp 1390915778 multi-step: add support for flashinfer attention backend (#1033) 1 week ago