Using different samples for the GAN generator and discriminator
teou opened this issue · 3 comments
Thank you for your great work!
I noticed that during training, the generator and the discriminator use different samples from the same dataloader:
generator:
https://github.com/lucidrains/magvit2-pytorch/blob/main/magvit2_pytorch/trainer.py#L289
discriminator:
https://github.com/lucidrains/magvit2-pytorch/blob/main/magvit2_pytorch/trainer.py#L327
Is this by design?
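To make the pattern concrete, here is a rough sketch of what those two linked lines amount to. This is not the actual `VideoTokenizerTrainer` code, just a paraphrase with placeholder names (`dl_iter`, `autoencoder`, `discriminator`, the hinge-style losses) to show that each update pulls its own batch:

```python
import torch
import torch.nn.functional as F

def train_step_separate_batches(dl_iter, autoencoder, discriminator, opt_g, opt_d):
    # generator / autoencoder update consumes batch A
    real_a = next(dl_iter)
    recon_a = autoencoder(real_a)
    g_loss = F.mse_loss(recon_a, real_a) - discriminator(recon_a).mean()
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    # discriminator update consumes a *different* batch B from the same iterator
    real_b = next(dl_iter)
    recon_b = autoencoder(real_b).detach()
    d_loss = (F.relu(1. - discriminator(real_b)) +
              F.relu(1. + discriminator(recon_b))).mean()
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
```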
@teou yeah, it's actually intentional. Back when I was building GANs, I tried it that way but didn't see better results.
Do you know if the paper did it the way you describe? And are there papers showing this makes a difference?
No, I haven't done any experiments yet.
In other implementations, the generator and discriminator seem to consume the same sample from the dataloader, which is what confused me. ^_^
For example, here:
https://github.com/dome272/VQGAN-pytorch/blob/main/training_vqgan.py#L55
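Roughly, that loop does something like the following. Again, this is only a paraphrased sketch with the same placeholder names as above, not the actual code from that file; the point is just that both updates consume the one batch:

```python
import torch
import torch.nn.functional as F

def train_step_shared_batch(dl_iter, autoencoder, discriminator, opt_g, opt_d):
    # a single batch is drawn once and reused by both updates
    real = next(dl_iter)
    recon = autoencoder(real)

    # generator / autoencoder update
    g_loss = F.mse_loss(recon, real) - discriminator(recon).mean()
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    # discriminator update reuses the same real batch and the detached reconstruction
    d_loss = (F.relu(1. - discriminator(real)) +
              F.relu(1. + discriminator(recon.detach()))).mean()
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
```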
@teou Yup, I know Dominic, very smart kid! I can take a peek at Robin's original implementation tomorrow and see if he did it the way you described. If he did, then let's just do that. However, I don't think it makes that big of a difference either way.