Adabound-Optimizer

A TensorFlow implementation of the optimizer proposed in "Adaptive Gradient Methods with Dynamic Bound of Learning Rate" (AdaBound).

References: "Adaptive Gradient Methods with Dynamic Bound of Learning Rate" (Luo et al., ICLR 2019), "Adam: A Method for Stochastic Optimization" (Kingma and Ba, ICLR 2015), and the Adam implementation in TensorFlow.
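
The core idea of AdaBound is an Adam-style update whose per-parameter step size is clipped between a lower and an upper bound, both of which converge to a single final learning rate, so training transitions smoothly from Adam toward SGD. The sketch below illustrates one update step in plain NumPy rather than against this repository's actual API; the function name `adabound_step` and its argument names are illustrative (the defaults follow the paper's notation), not this implementation's exact interface.

```python
import numpy as np

def adabound_step(param, grad, m, v, t, lr=1e-3, final_lr=0.1,
                  beta1=0.9, beta2=0.999, gamma=1e-3, eps=1e-8):
    """One AdaBound update step (illustrative sketch of Luo et al., 2019)."""
    # Adam-style first and second moment estimates.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias corrections, folded into the step size as in Adam.
    bias_corr1 = 1 - beta1 ** t
    bias_corr2 = 1 - beta2 ** t
    step_size = lr * np.sqrt(bias_corr2) / bias_corr1
    # Dynamic bounds: both approach final_lr as t grows, so the
    # effective step size converges to an SGD-like constant rate.
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    # Clip the per-parameter learning rate, then apply the update.
    eta = np.clip(step_size / (np.sqrt(v) + eps), lower, upper)
    param = param - eta * m
    return param, m, v

# Toy usage: minimize f(x) = x^2 (gradient is 2x).
x, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 501):
    x, m, v = adabound_step(x, 2 * x, m, v, t)
print(x)  # approaches 0
```

The clipping step is what distinguishes AdaBound from Adam: early in training the bounds are loose (lower near 0, upper very large) and the update behaves like Adam, while as t grows both bounds tighten around `final_lr` and the update behaves like SGD with that rate.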