AlpinDale 7d79c0e726 chore: use nvml query to avoid accidental cuda initialization 7 months ago
blocksparse_attention 7d79c0e726 chore: use nvml query to avoid accidental cuda initialization 7 months ago
__init__.py 9d81716bfd [v0.5.3] Release Candidate (#388) 10 months ago
ipex_attn.py 805fa8721d feat: use intel_extension_for_pytorch for CPU backend 7 months ago
paged_attn.py 156f577f79 feat: switch from `PYBIND11_MODULE` to `TORCH_LIBRARY` (#569) 7 months ago
prefix_prefill.py 7d79c0e726 chore: use nvml query to avoid accidental cuda initialization 7 months ago
triton_flash_attn.py 8e11259e90 missing triton autoconfig for rocm flash attn 8 months ago