
Optimizer-PyTorch

A package of optimizers implemented with PyTorch.
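A minimal usage sketch, assuming the optimizers in this package follow the standard torch.optim.Optimizer interface; torch.optim.Adam is used as a stand-in below because the package's actual import path is not documented in this README.

```python
import torch
import torch.nn as nn

# Stand-in: torch.optim.Adam used in place of this package's optimizers,
# since the exact import path is an assumption, not documented here.
model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x, y = torch.randn(32, 10), torch.randn(32, 1)
for _ in range(100):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
```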

Optimizer List

SGD: stochastic gradient descent

Adam: A Method for Stochastic Optimization

Adabound: Adaptive Gradient Methods with Dynamic Bound of Learning Rate

RAdam: On the Variance of the Adaptive Learning Rate and Beyond

Lookahead: Lookahead Optimizer: k steps forward, 1 step back (a minimal wrapper sketch follows this list)

Optimistic: OptimAdam, OMD, ExtraGradient (a toy optimistic-update sketch follows this list)

STORM: STOchastic Recursive Momentum (a toy sketch of the recursion follows this list)

Others
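Lookahead wraps any inner optimizer: the fast weights take k ordinary steps, then the slow weights move a fraction alpha toward them and the fast weights are reset. Below is a minimal sketch of that wrapper; it illustrates the idea and is not necessarily this repository's implementation (class and argument names are assumptions).

```python
import torch

class Lookahead:
    """Minimal Lookahead wrapper (illustrative; names and defaults are assumptions).

    Every k fast steps, the slow weights move a fraction alpha toward the
    fast weights, and the fast weights are reset to the slow weights.
    """

    def __init__(self, base_optimizer, k=5, alpha=0.5):
        self.base_optimizer = base_optimizer
        self.k = k
        self.alpha = alpha
        self.counter = 0
        # Slow weights start as a copy of the current (fast) parameters.
        self.slow_weights = [
            [p.detach().clone() for p in group["params"]]
            for group in base_optimizer.param_groups
        ]

    @torch.no_grad()
    def step(self):
        self.base_optimizer.step()           # one fast step with the inner optimizer
        self.counter += 1
        if self.counter % self.k == 0:       # every k steps: synchronize
            for group, slow in zip(self.base_optimizer.param_groups, self.slow_weights):
                for p, q in zip(group["params"], slow):
                    q.add_(p.detach() - q, alpha=self.alpha)  # slow += alpha * (fast - slow)
                    p.copy_(q)                                # fast <- slow

    def zero_grad(self, set_to_none=True):
        self.base_optimizer.zero_grad(set_to_none=set_to_none)


# Example: wrap Adam as the inner (fast) optimizer.
# model = torch.nn.Linear(10, 1)
# opt = Lookahead(torch.optim.Adam(model.parameters(), lr=1e-3), k=5, alpha=0.5)
```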
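The optimistic group (OptimAdam, OMD, ExtraGradient) shares one idea: reuse the previous gradient to anticipate the next one, which stabilizes min-max training where plain simultaneous gradient steps cycle or diverge. The toy below shows the optimistic update x_{t+1} = x_t - lr * (2 * g_t - g_{t-1}) on the bilinear game f(x, y) = x * y; it illustrates the update rule only and is not code from this package.

```python
import torch

# Bilinear toy game f(x, y) = x * y: plain simultaneous gradient descent/ascent
# spirals away from the equilibrium (0, 0); the optimistic update converges.
x, y = torch.tensor(1.0), torch.tensor(1.0)
gx_prev, gy_prev = torch.tensor(0.0), torch.tensor(0.0)
lr = 0.2

for _ in range(500):
    gx, gy = y.clone(), x.clone()       # df/dx = y, df/dy = x (evaluated simultaneously)
    x = x - lr * (2 * gx - gx_prev)     # optimistic descent step for x
    y = y + lr * (2 * gy - gy_prev)     # optimistic ascent step for y
    gx_prev, gy_prev = gx, gy

print(float(x), float(y))               # both shrink toward the equilibrium (0, 0)
```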
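STORM replaces ordinary momentum with a recursive, variance-reduced gradient estimate d_t = grad f(x_t; xi_t) + (1 - a) * (d_{t-1} - grad f(x_{t-1}; xi_t)), where the same sample xi_t is evaluated at both the current and previous iterate. The toy below runs that recursion with a fixed step size and momentum weight on a noisy quadratic (the paper uses adaptive a_t and step sizes); it is an illustration of the estimator, not this repository's code.

```python
import torch

torch.manual_seed(0)
dim = 10
A = torch.randn(dim, dim)
A = A @ A.T / dim + torch.eye(dim)    # toy positive-definite quadratic (assumption)
b = torch.randn(dim)

def noisy_grad(x, noise):
    # Noisy gradient of f(x) = 0.5 x^T A x - b^T x; `noise` plays the role of the sample xi_t.
    return A @ x - b + noise

x = torch.zeros(dim)
x_prev = x.clone()
d = torch.zeros(dim)                  # recursive momentum estimator d_t
lr, a = 0.05, 0.1                     # fixed step size and momentum weight (the paper adapts both)

for t in range(200):
    noise = 0.1 * torch.randn(dim)    # one sample xi_t, reused at x_t and x_{t-1}
    g_cur = noisy_grad(x, noise)
    if t == 0:
        d = g_cur                                              # d_1 is just the first stochastic gradient
    else:
        d = g_cur + (1 - a) * (d - noisy_grad(x_prev, noise))  # STORM recursion
    x_prev = x.clone()
    x = x - lr * d

print("final true gradient norm:", (A @ x - b).norm().item())
```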