CGCL-codes/AMT-GAN

Question about the number of training epochs

Yueming6568 opened this issue · 1 comments

Hi! Great job! We noticed that the number of training epochs in your work is set to 5 in config.yaml, while the official epoch count for PSGAN is 50. Could you check this and explain why?
Thanks a lot, looking forward to your reply~

Hi!
This is because 5 epochs are enough for the loss to converge to a certain level. Also, we didn't want the computational cost to be much higher than that of the competitors in the paper.
That said, more training epochs will improve performance. As for PSGAN, I think their goal is to build a high-quality GAN for makeup transfer, so computational cost is probably not their primary concern.
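For reference, the setting in question would be a fragment of config.yaml along these lines (the exact key name here is an assumption; check the repository's actual config.yaml):

```yaml
# Hypothetical fragment of config.yaml -- the key name is an assumption.
# Raising this value (e.g. toward PSGAN's 50) may further improve results,
# at the cost of proportionally longer training time.
epochs: 5
```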