Changing to an optimizer that creates slot variables results in an exception
dmitrinesterenko opened this issue · 1 comment
Adopting the Adam or AdaGrad optimizer results in the exception below, because the slot variables these optimizers create (for example, AdaGrad's per-variable accumulators) are not yet initialized. Read more here
```
Caused by op 'minimize_loss/update_Projection/bs/ApplyAdagrad', defined at:
File "rnn.py", line 455, in
test_RNN()
File "rnn.py", line 437, in test_RNN
stats = model.train(verbose=True)
File "rnn.py", line 382, in train
train_acc, val_acc, loss_history, val_loss = self.run_epoch(new_model=True)
File "rnn.py", line 314, in run_epoch
init = tf.global_variables_initializer()
File "rnn.py", line 253, in training
train_op = trainer.minimize(loss, name="minimize_loss")
File "/home/dmitri/anaconda2/envs/py35/lib/python3.5/site-packages/tensorflow/python/training/optimizer.py", line 325, in minimize
name=name)
File "/home/dmitri/anaconda2/envs/py35/lib/python3.5/site-packages/tensorflow/python/training/optimizer.py", line 456, in apply_gradients
update_ops.append(processor.update_op(self, grad))
File "/home/dmitri/anaconda2/envs/py35/lib/python3.5/site-packages/tensorflow/python/training/optimizer.py", line 97, in update_op
return optimizer._apply_dense(g, self._v) # pylint: disable=protected-access
File "/home/dmitri/anaconda2/envs/py35/lib/python3.5/site-packages/tensorflow/python/training/adagrad.py", line 80, in _apply_dense
use_locking=self._use_locking)
File "/home/dmitri/anaconda2/envs/py35/lib/python3.5/site-packages/tensorflow/python/training/gen_training_ops.py", line 82, in apply_adagrad
grad=grad, use_locking=use_locking, name=name)
File "/home/dmitri/anaconda2/envs/py35/lib/python3.5/site-packages/tensorflow/python/framework/op_def_library.py", line 767, in apply_op
op_def=op_def)
File "/home/dmitri/anaconda2/envs/py35/lib/python3.5/site-packages/tensorflow/python/framework/ops.py", line 2506, in create_op
original_op=self._default_original_op, op_def=op_def)
File "/home/dmitri/anaconda2/envs/py35/lib/python3.5/site-packages/tensorflow/python/framework/ops.py", line 1269, in init
self._traceback = _extract_stack()
FailedPreconditionError (see above for traceback): Attempting to use uninitialized value Projection/bs/adagrad_optimizer
[[Node: minimize_loss/update_Projection/bs/ApplyAdagrad = ApplyAdagrad[T=DT_FLOAT, _class=["loc:@Projection/bs"], use_locking=false, _device="/job:localhost/replica:0/task:0/cpu:0"](Projection/bs, Projection/bs/adagrad_optimizer, minimize_loss/learning_rate, gradients/add_6_grad/tuple/control_dependency_1)]]
```
Fixed in #34