awjuliani/DeepRL-Agents

What is `grad_norms` in AC_Network?

yrlu opened this issue · 2 comments

yrlu commented

Hi,

I came across your A3C implementation and found the following two lines in AC_network.py:

```python
self.var_norms = tf.global_norm(local_vars)
grads, self.grad_norms = tf.clip_by_global_norm(self.gradients, 40.0)
```

I wonder what `grad_norms` is for? It seems to me that it is not used.

Thanks!

IbrahimSobh commented

`grad_norms` is the global norm of the network's gradients. By monitoring this value, you can see how large the updates applied to the network parameters are during training.
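For reference, here is a minimal sketch of how that norm can be monitored, e.g. by writing it to a TensorBoard scalar summary. This is TF1-style code on a toy network; the layer sizes and summary names are my own assumptions, not the repo's exact setup:

```python
import tensorflow as tf  # TF1.x, matching the API style used in the repo

# Toy stand-in for the actor-critic network: a single dense layer.
states = tf.placeholder(tf.float32, [None, 4])
targets = tf.placeholder(tf.float32, [None, 1])
value = tf.layers.dense(states, 1)
loss = tf.reduce_mean(tf.square(value - targets))

local_vars = tf.trainable_variables()
gradients = tf.gradients(loss, local_vars)

# Same two calls as in AC_Network.
var_norms = tf.global_norm(local_vars)
grads, grad_norms = tf.clip_by_global_norm(gradients, 40.0)

# Log both norms so their evolution is visible in TensorBoard.
summary_op = tf.summary.merge([
    tf.summary.scalar('grad_norm', grad_norms),
    tf.summary.scalar('var_norm', var_norms),
])
```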

In the case of gradient explosion, we usually use a simple trick called gradient clipping to prevent excessively large updates to the network parameters.

In the code above, 40.0 is the clipping threshold.
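To make the effect concrete, here is a small self-contained example (TF1-style; the numbers are made up for illustration) showing that the returned norm is the pre-clipping global norm, and that all gradients are rescaled together when that norm exceeds the threshold:

```python
import tensorflow as tf  # TF1.x

# Two fake gradient tensors whose combined global norm is
# sqrt(30^2 + 40^2) = 50, which exceeds the threshold of 40.
g1 = tf.constant([30.0, 0.0])
g2 = tf.constant([0.0, 40.0])

clipped, norm = tf.clip_by_global_norm([g1, g2], 40.0)

with tf.Session() as sess:
    c, n = sess.run([clipped, norm])
    print(n)     # 50.0 -- the norm of the *unclipped* gradients
    print(c[0])  # [24.  0.] -- every gradient scaled by 40 / 50
    print(c[1])  # [ 0. 32.]
```

Typically the clipped `grads` (rather than the raw gradients) are what end up being passed to the optimizer's `apply_gradients`.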

Try different values (10, 20, 50, 80), see the effect on training, and share the results if you like :)

yrlu commented

Thanks @IbrahimSobh, that helps!