LARS Optimizer

Layer-wise Adaptive Rate Scaling (LARS) is a Neural Network Optimizer. The technique enables Large Batch Training without a significant decrease in accuracy (You, Gitman, and Ginsburg, n.d.). A secondary goal is Fast Neural Network Training.
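The core idea, as a sketch of the update from the paper: each layer gets its own local learning rate, a "trust ratio" of the weight norm to the gradient norm, which then multiplies the global learning rate in an SGD-with-momentum update (here \(\eta\) is the trust coefficient and \(\beta\) the weight decay):

$$\lambda^{l} = \eta \, \frac{\lVert w^{l} \rVert}{\lVert \nabla L(w^{l}) \rVert + \beta \lVert w^{l} \rVert}, \qquad \Delta w^{l}_{t} = \gamma \, \lambda^{l} \left( \nabla L(w^{l}_{t}) + \beta \, w^{l}_{t} \right)$$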

Implementations
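A minimal sketch of the layer-wise scaling in plain NumPy, assuming SGD with momentum as the base optimizer; the function name `lars_update` and the default hyperparameter values are illustrative, not taken from the paper's reference code:

```python
import numpy as np

def lars_update(weights, grads, velocities, *,
                global_lr=0.1, trust_coefficient=0.001,
                weight_decay=5e-4, momentum=0.9):
    """One LARS step over lists of per-layer parameter arrays (illustrative sketch)."""
    for w, g, v in zip(weights, grads, velocities):
        w_norm = np.linalg.norm(w)
        g_norm = np.linalg.norm(g)
        # Layer-wise "trust ratio": ||w|| / (||grad|| + weight_decay * ||w||),
        # guarded against zero norms early in training.
        if w_norm > 0 and g_norm > 0:
            local_lr = trust_coefficient * w_norm / (g_norm + weight_decay * w_norm)
        else:
            local_lr = 1.0
        # Momentum update with weight decay folded into the gradient,
        # scaled by both the global and the layer-local learning rate.
        v *= momentum
        v += global_lr * local_lr * (g + weight_decay * w)
        w -= v
    return weights, velocities
```

Production implementations of the same idea exist in several frameworks (e.g. large-batch training codebases for ImageNet and BERT); the sketch above only shows the per-layer scaling logic.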