Tri Dao | 393882bc08 | [LayerNorm] Implement LN with parallel residual, support dim 8k | 1 year ago
Tri Dao | eb33e587e9 | [LayerNorm] Rename x1 -> residual | 1 year ago
Tri Dao | 6738d9477d | [LayerNorm] Implement RMS Norm | 1 year ago
Tri Dao | 5db330519a | [LayerNorm] Support taking subset of input or subset of output | 2 years ago
Tri Dao | ae137ed17a | [LayerNorm] Fuse LayerScale | 2 years ago
Tri Dao | 8c6609ae1a | [LayerNorm] Support all dimensions up to 6k (if divisible by 8) | 2 years ago
Tri Dao | fa6d1ce44f | Add fused_dense and dropout_add_layernorm CUDA extensions | 2 years ago