keras-team/keras-contrib

Add AdaMod optimizer

mffigueroa opened this issue · 0 comments

I'd like to add the AdaMod optimizer to the keras-contrib optimizers.
Paper reference: https://arxiv.org/abs/1910.12249
I modified the Adam optimizer code from the main keras repo and added (1) an exponential moving average of past per-parameter learning rates, controlled by a new beta_3 coefficient, and (2) clamping of the current learning rates against that average, as described in the paper.
Here is my current branch:
https://github.com/mffigueroa/keras-contrib/commits/user/mffigueroa/adamod
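For reference, here is a minimal NumPy sketch of the AdaMod update rule from the paper (not the branch's Keras code; the function name, `state` dict, and default hyperparameters are my own illustration):

```python
import numpy as np

def adamod_update(param, grad, state, lr=1e-3, beta_1=0.9, beta_2=0.999,
                  beta_3=0.999, eps=1e-8):
    """One AdaMod step (sketch, following arXiv:1910.12249).

    `state` holds the Adam moments m and v, the EMA of past
    per-parameter step sizes s, and the step counter t.
    """
    state['t'] += 1
    t = state['t']
    # Standard Adam first/second moment EMAs with bias correction
    state['m'] = beta_1 * state['m'] + (1 - beta_1) * grad
    state['v'] = beta_2 * state['v'] + (1 - beta_2) * grad ** 2
    m_hat = state['m'] / (1 - beta_1 ** t)
    v_hat = state['v'] / (1 - beta_2 ** t)
    # Adam's per-parameter step size
    eta = lr / (np.sqrt(v_hat) + eps)
    # AdaMod: exponential average of past step sizes via beta_3 ...
    state['s'] = beta_3 * state['s'] + (1 - beta_3) * eta
    # ... then clamp the current step size by that average
    eta = np.minimum(eta, state['s'])
    return param - eta * m_hat
```

With beta_3 = 0 the clamp is inactive on average and the update reduces to Adam's; larger beta_3 smooths and bounds the effective learning rates more aggressively.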