CyberZHG/keras-radam

Usage for Tensorflow 2.0 minimize with var_list

Uiuran opened this issue · 6 comments

Since the documentation for OptimizerV2.minimize() is lacking, it is hard to work out how to use it: the function no longer gathers trainable_variables automatically and requires var_list to be passed in.
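For context, a minimal sketch of the TF 2.x calling convention (the toy variables and loss here are illustrative, not from my model):

import tensorflow as tf

w = tf.Variable(2.0)
b = tf.Variable(0.5)
opt = tf.keras.optimizers.Adam(learning_rate=1e-3)

# In TF 2.x the loss is passed as a callable and the variables must be
# listed explicitly; minimize() no longer reads a global collection.
loss = lambda: (w * 3.0 + b - 1.0) ** 2
opt.minimize(loss, var_list=[w, b])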

My DNN to date uses TF 1.14; I would prefer to see RAdam working there before migrating all the code (production demands), or else I will give up on it in favor of some other adaptive-learning-rate scheme.

Also, I think this is more of a usage-example question than a bug; it's only because Keras still does not provide documentation for OptimizerV2.

You don't need to know anything about OptimizerV2 if you are using Keras.

Exactly, but I am not using anything from Keras (except the pre-built Conv2D) in my code, and I don't plan to use only Keras.

However, OptimizerV2 is still being ported into the Keras lib module, which is now part of the core. That's why I said Keras does not provide docs for it.

Otherwise OptimizerV2 is the same as tf.train.Optimizer.

Ah... unfortunately it's not: in TF 2.0, minimize() does require var_list, but in 1.14 it doesn't.
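For comparison, a TF 1.x-style sketch (again with an illustrative toy loss) where var_list stays optional because the optimizer collects the trainable variables from the graph:

import tensorflow as tf  # assumes a 1.x release such as 1.14

x = tf.Variable(1.0)
loss = tf.square(x - 3.0)

# No var_list needed: minimize() picks up GraphKeys.TRAINABLE_VARIABLES.
train_op = tf.train.AdamOptimizer(learning_rate=1e-3).minimize(loss)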

I am trying to downgrade your code, but I get the following:

RuntimeError: Override _create_vars instead of _create_slots when descending from OptimizerV2 (class RAdam)

If you can help me with this downgrade, at least by pointing out some changes, I will be very thankful. But if you can't, I will use RAdam only after the future upgrade to the TF 2.0 beta in my software...
Thanks for now.

See #18; the TensorFlow version of the optimizer was added in 0.8.0:

from keras_radam.training import RAdamOptimizer

RAdamOptimizer(learning_rate=1e-3)
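A minimal TF 1.x-style usage sketch (the toy loss is illustrative; this assumes RAdamOptimizer follows the tf.train.Optimizer interface, so minimize() collects the trainable variables on its own):

import tensorflow as tf
from keras_radam.training import RAdamOptimizer

x = tf.Variable(5.0)
loss = tf.square(x - 2.0)
train_op = RAdamOptimizer(learning_rate=1e-3).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        sess.run(train_op)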