| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| Joel Lamy-Poirier | 767b71ccf0 | Fix random state for dropout_layer_norm (#315) | 1 year ago |
| Ikko Eltociear Ashimine | dfc60f6b7d | [LayerNorm] Fix typo in ln_api.cpp | 1 year ago |
| Tri Dao | 393882bc08 | [LayerNorm] Implement LN with parallel residual, support dim 8k | 1 year ago |
| Tri Dao | eb33e587e9 | [LayerNorm] Rename x1 -> residual | 1 year ago |
| Tri Dao | 6738d9477d | [LayerNorm] Implement RMS Norm | 1 year ago |
| Tri Dao | a8cfe51551 | Implement Tensor Parallel for transformer Block | 2 years ago |
| Tri Dao | 5db330519a | [LayerNorm] Support taking subset of input or subset of output | 2 years ago |
| Tri Dao | ae137ed17a | [LayerNorm] Fuse LayerScale | 2 years ago |
| Tri Dao | 8c6609ae1a | [LayerNorm] Support all dimensions up to 6k (if divisible by 8) | 2 years ago |
| Tri Dao | fa6d1ce44f | Add fused_dense and dropout_add_layernorm CUDA extensions | 2 years ago |