gan training
Closed this issue · 5 comments
1292765944 commented
Did you use WGAN for training? I noticed the clip_grad_norm_ call in your code.
hytseng0509 commented
We don't use WGAN in this code. We apply gradient clipping to avoid exploding gradients and thus stabilize the training.
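For reference, clip_grad_norm_ rescales all gradients together so their global L2 norm never exceeds a threshold. Below is a minimal, dependency-free sketch of that behavior over a plain list of floats; it mirrors the idea behind torch.nn.utils.clip_grad_norm_ but is illustrative, not the repo's code.

```python
import math

def clip_grad_norm(grads, max_norm):
    """Rescale gradients so their global L2 norm is at most max_norm.

    Illustrative sketch of the idea behind torch.nn.utils.clip_grad_norm_,
    applied to a flat list of floats instead of parameter tensors.
    """
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        # Scale every gradient by the same factor; a small epsilon
        # guards against division by zero.
        scale = max_norm / (total_norm + 1e-6)
        grads = [g * scale for g in grads]
    return grads, total_norm

# A spiking gradient gets rescaled down to the threshold;
# a small one passes through untouched.
clipped, norm = clip_grad_norm([30.0, 40.0], max_norm=5.0)  # norm = 50.0
small, small_norm = clip_grad_norm([0.3, 0.4], max_norm=5.0)  # norm = 0.5
```

Because every component is scaled by the same factor, clipping preserves the gradient's direction and only shrinks its magnitude, which is why it stabilizes training without biasing the update direction.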
1292765944 commented
@hytseng0509 Yes, the training process seems healthy with gradient clipping. Another question: what can we learn from the GAN loss? In other words, what should the disContent loss look like once the model is fully trained?
hytseng0509 commented
We usually stop training when the quality of the translated images stops improving or when G_loss converges.
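One simple way to operationalize "G_loss converges" is to compare the mean loss over the most recent window of steps against the window before it. The sketch below is one such heuristic; the window size and tolerance are illustrative choices, not values from the repo.

```python
def has_converged(losses, window=10, tol=1e-3):
    """Heuristic stopping rule for a noisy loss curve.

    Declares convergence when the mean loss over the last `window` steps
    differs from the mean of the preceding window by less than `tol`.
    `window` and `tol` are illustrative hyperparameters.
    """
    if len(losses) < 2 * window:
        return False  # not enough history to compare two windows
    recent = sum(losses[-window:]) / window
    previous = sum(losses[-2 * window:-window]) / window
    return abs(recent - previous) < tol

# A flat loss curve is flagged as converged; a steadily
# decreasing one is not.
has_converged([1.0] * 20)                          # True
has_converged([2.0 - 0.1 * i for i in range(20)])  # False
```

In practice this check is best combined with periodically inspecting the translated images, since GAN losses are adversarial and a stable loss value does not by itself guarantee good sample quality.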
unlugi commented
Why is gradient clipping applied only for content discriminator gradients?
hytseng0509 commented
We found that it stabilizes the training of the content discriminator.
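To make the selective clipping concrete, here is a minimal sketch of a training step that clips only the content discriminator's gradients and leaves the other modules untouched. The module names, threshold, and dict-of-gradients structure are illustrative assumptions, not the repo's actual code (which calls torch.nn.utils.clip_grad_norm_ on one module's parameters).

```python
import math

def global_norm(grads):
    """Global L2 norm of a flat list of gradient values."""
    return math.sqrt(sum(g * g for g in grads))

def clip_content_discriminator(grads_by_module, max_norm=5.0):
    """Clip only the content discriminator's gradients by global norm.

    `grads_by_module` maps hypothetical module names to flat gradient
    lists; only the "dis_content" entry is rescaled, mirroring the
    thread's point that clipping is applied to one module.
    """
    out = dict(grads_by_module)
    g = out["dis_content"]
    norm = global_norm(g)
    if norm > max_norm:
        out["dis_content"] = [x * max_norm / norm for x in g]
    return out

grads = {"dis_content": [30.0, 40.0], "gen": [30.0, 40.0]}
clipped = clip_content_discriminator(grads)
# clipped["gen"] is unchanged; clipped["dis_content"] now has norm 5.0
```

The design rationale matches the thread: the content discriminator sees features from both domains and its gradients can spike, so clipping just that module tames the instability without restricting the generator or the domain discriminators.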