Issues
No module named math_ops
#46 opened by deeptimittal97 - 7
AttributeError: 'TFOptimizer' object has no attribute 'learning_rate'
#41 opened by Delicious-Bitter-Melon - 1
How can I use RAdam in TF 1.4?
#45 opened by HouGall - 1
ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval.
#44 opened by parkourcx - 3
Could not interpret optimizer identifier
#30 opened by connorlbark - 1
Warmup causes NaN
#43 opened by lyw615 - 1
Any other parameters needed when using warmup?
#42 opened by lyw615 - 1
Very slow implementation
#39 opened by MaximusMutschler - 3
Unknown optimizer: RAdam
#16 opened by stvogel - 0
TF.keras
#38 opened by sonfire186 - 1
Generate First Release
#37 opened by Tata17 - 4
Keras backend support
#27 opened by flydsc - 1
Maintenance
#36 opened by RomainBrault - 3
amsgrad parameter
#33 opened by nicolaspanel - 3
Issue about the dtype
#31 opened by TianrongChen - 2
Ranger Optimizer extension
#32 opened by misrasaurabh1 - 1
epsilon not compatible with Adam
#29 opened by KarlisFre - 1
Not compatible with TF 1.8: AttributeError: 'RAdamOptimizer' object has no attribute '_call_if_callable'
#25 opened by zxzxzxygithub - 1
Usage for TensorFlow 2.0: minimize with var_list
#17 opened by Uiuran - 1
RAdam in TensorFlow
#18 opened by phamnam95 - 4
ValueError: ('Could not interpret optimizer identifier:', <keras_radam.optimizers.RAdam object at 0x7fd0dab35358>)
#12 opened by zirlman - 1
To use Warmup
#11 opened by huyhieupham - 1
Typo in README.md
#13 opened by timvink - 4
Do I need to tune learning rates?
#9 opened by xuzhang5788 - 4
Please add weight decay
#1 opened by sbarman25 - 2
min_lr isn't set properly
#10 opened by tfriedel
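Several of the issues above (warmup causing NaN, warmup parameters, learning-rate tuning) revolve around RAdam's variance rectification, which suppresses the adaptive learning rate in the earliest steps. For context, here is a minimal pure-Python sketch of the rectification factor as described in the RAdam paper; this is an illustrative reimplementation, not the code from this repository:

```python
import math


def radam_rectifier(t, beta2=0.999):
    """Return RAdam's variance-rectification factor r_t for step t (1-based),
    or None while rho_t <= 4 -- the early phase in which RAdam falls back to
    an un-rectified momentum (SGD-with-momentum-style) update.
    """
    # Maximum length of the approximated simple moving average.
    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    # Its value at the current step; small for early t, approaches rho_inf.
    rho_t = rho_inf - 2.0 * t * beta2 ** t / (1.0 - beta2 ** t)
    if rho_t <= 4.0:
        return None
    # Rectification term; approaches 1 as t grows.
    return math.sqrt(
        (rho_t - 4.0) * (rho_t - 2.0) * rho_inf
        / ((rho_inf - 4.0) * (rho_inf - 2.0) * rho_t)
    )
```

With the default `beta2 = 0.999`, the first few steps return `None` (no adaptive update), then the factor starts near zero and ramps toward 1, which is why RAdam behaves like a built-in warmup schedule.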