CyberZHG/keras-radam

AttributeError: 'RAdam' object has no attribute 'apply_gradients'

bionicles opened this issue · 1 comment

Describe the Bug

The optimizer exposes a different API from the other optimizers in tf.keras, so using it as a drop-in replacement for tf.keras.optimizers.Adam in a custom training loop crashes with an AttributeError.
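For reference, here is a minimal sketch of the custom training loop pattern that works with the stock tf.keras optimizers (the model, data, and loss here are placeholders, not from the real project):

```python
import tensorflow as tf

# Tiny stand-in model and data, just to exercise the API.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.MeanSquaredError()

x = tf.random.normal((8, 4))
y_true = tf.random.normal((8, 1))

# Standard TF2 custom training step: any drop-in optimizer
# must provide apply_gradients for this to work.
with tf.GradientTape() as tape:
    y_pred = model(x)
    loss = loss_fn(y_true, y_pred)
gradients = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(gradients, model.trainable_variables))
```

Swapping RAdam in for Adam in this loop is exactly the step that fails.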

Version Info

  • yes, I'm using the latest version

Minimal Codes To Reproduce

import os
os.environ['TF_KERAS'] = '1'  # must be set before importing keras_radam

import tensorflow as tf
from keras_radam import RAdam

optimizer = RAdam()
inputs = get_task_inputs(xxxxxx)
with tf.GradientTape() as tape:
    y_pred = model(inputs)
    losses = loss_fn(y_true, y_pred)
gradients = tape.gradient(losses, model.trainable_variables)
optimizer.apply_gradients(zip(gradients, model.trainable_variables))

Traceback (most recent call last):
  File "neuromax.py", line 67, in <module>
    results = [agent.train() for _ in range(MAX_LOOPS)]
  File "neuromax.py", line 67, in <listcomp>
    results = [agent.train() for _ in range(MAX_LOOPS)]
  File "/home/bion/hax/neuromax/nature/agent.py", line 289, in train
    for episode_number in range(EPISODES_PER_PRACTICE_SESSION)]
  File "/home/bion/hax/neuromax/nature/agent.py", line 289, in <listcomp>
    for episode_number in range(EPISODES_PER_PRACTICE_SESSION)]
  File "/home/bion/hax/neuromax/nature/agent.py", line 288, in <listcomp>
    for task_key, task_dict in self.tasks.items()]
  File "/home/bion/hax/neuromax/nurture/clevr/clevr.py", line 56, in run_clevr_task
    agent.train_op(task_id, inputs, loss_fn, y_true, priors)
  File "/home/bion/hax/neuromax/nature/agent.py", line 257, in train_op
    self.optimizer.apply_gradients(gradients_and_variables)
AttributeError: 'RAdam' object has no attribute 'apply_gradients'

A TensorFlow version was added. Keep your code unchanged and update the package to use it.