Hezi-Resheff/Oreilly-Learning-TensorFlow

update global_step

raytroop opened this issue · 0 comments

train_step = tf.train.GradientDescentOptimizer(learningRate).minimize(loss)

The argument 'global_step=global_step' should be passed to 'minimize()'.
Only then does global_step increment by one on every training iteration, so the learning rate actually decays as intended:

import tensorflow as tf

# Learning rate decay: global_step counts completed training iterations
global_step = tf.Variable(0, trainable=False)
learningRate = tf.train.exponential_decay(learning_rate=0.1,
                                          global_step=global_step,
                                          decay_steps=1000,
                                          decay_rate=0.95,
                                          staircase=True)
# Passing global_step to minimize() makes the optimizer increment it after
# each update, which drives the exponential decay defined above.
train_step = tf.train.GradientDescentOptimizer(learningRate).minimize(loss, global_step=global_step)
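A quick way to confirm the behavior is to run the graph and print the counter and the decayed rate; this is a minimal sketch, assuming the graph above has already been built and that `loss` is defined elsewhere:

# Verification sketch (assumes the graph above and a defined `loss`)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(3000):
        sess.run(train_step)
        if (i + 1) % 1000 == 0:
            # global_step advances by one per minimize() call, so with
            # staircase=True the learning rate drops every 1000 steps.
            step, lr = sess.run([global_step, learningRate])
            print(step, lr)

Without global_step passed to minimize(), the printed step stays at 0 and the learning rate never changes.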