A TensorFlow and Keras implementation of the AdaBound optimizer from the 2019 ICLR paper by Luo et al. (see the reference below). The official PyTorch implementation by the paper's authors can be found here.
First, add adabound-tensorflow/keras to your project, then import the optimizer for your framework (fuller usage sketches follow this list):
- Keras

  ```python
  from adabound_keras.adabound import AdaBound

  optimizer = AdaBound(lr=0.001, final_lr=0.1)
  ```
- TensorFlow

  ```python
  from adabound_tensorflow.adabound import AdaboundOptimizer

  optimizer = AdaboundOptimizer(learning_rate=0.001, final_lr=0.1)
  ```
- Progress:
  - [x] Keras version
  - [x] TensorFlow version
  - [ ] Add test demo
- Luo et al. Adaptive Gradient Methods with Dynamic Bound of Learning Rate. In Proc. of ICLR 2019.
- Original Implementation (PyTorch)
- keras.optimizers
- tensorflow.adam