Commit History

Author  | SHA1       | Message                                                                     | Date
Tri Dao | bcd918f275 | [LayerNorm] Add option to write result to out and residual_out             | 4 months ago
Tri Dao | bd82d6c6eb | Revert "[LayerNorm] Don't store x + residual if we don't need gradients"   | 4 months ago
Tri Dao | 800401847e | [LayerNorm] Don't store x + residual if we don't need gradients            | 4 months ago
Tri Dao | 36587c01cb | [LayerNorm] Update layer_norm_linear                                       | 9 months ago
Tri Dao | bdcae547c7 | [LayerNorm] Don't exit early in the backward pass (fix #781)               | 10 months ago
Tri Dao | c9861a032d | [LayerNorm] Initialize mean and rstd tensor using x.device                 | 11 months ago
Tri Dao | f5b308e258 | [LayerNorm] Rename layernorm.py -> layer_norm.py                           | 11 months ago