AlpinDale 7ca63930c8 support deepseek_v3 model 1 week ago
all_reduce f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
attention 3bb0f07461 chore: rename `task_handler` to `worker` (#985) 2 weeks ago
backup 3bb0f07461 chore: rename `task_handler` to `worker` (#985) 2 weeks ago
core f1ea7711bd core: do not compile ScalarType for torch < 2.4.0 (#938) 2 weeks ago
cpu a113309876 kernel: add meta functions for ops to prevent graph breaks (#1019) 1 week ago
cutlass_extensions 93bc863591 feat: Machete Kernels for Hopper GPUs (#842) 1 month ago
hadamard 5d288aa76c feat: add fast hadamard transformation kernels (#232) 11 months ago
mamba 0256ed236b feat: windows support (#790) 2 months ago
moe 7ca63930c8 support deepseek_v3 model 1 week ago
prepare_inputs 1390915778 multi-step: add support for flashinfer attention backend (#1033) 1 week ago
punica f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
quantization a113309876 kernel: add meta functions for ops to prevent graph breaks (#1019) 1 week ago
sampling bfc8988116 feat: add cuda sampling kernels for top_k and top_p (#828) 1 month ago
activation_kernels.cu f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
cache.h f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
cache_kernels.cu 0256ed236b feat: windows support (#790) 2 months ago
cuda_compat.h f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
cuda_utils.h 93bc863591 feat: Machete Kernels for Hopper GPUs (#842) 1 month ago
cuda_utils_kernels.cu f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
dispatch_utils.h f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
layernorm_kernels.cu e14223dce5 kernel: use `cub::BlockReduce` instead of custom impl (#895) 3 weeks ago
ops.h 1390915778 multi-step: add support for flashinfer attention backend (#1033) 1 week ago
permute_cols.cu 93bc863591 feat: Machete Kernels for Hopper GPUs (#842) 1 month ago
pos_encoding_kernels.cu f1d0b77c92 [0.6.0] Release Candidate (#481) 4 months ago
torch_bindings.cpp 1390915778 multi-step: add support for flashinfer attention backend (#1033) 1 week ago