github-pengge/PyTorch-progressive_growing_of_gans

latent_size != 512 fails

jcpeterson opened this issue · 4 comments

Any latent_size other than 512 throws an error

Changing latent_size is possible if you also change fmap_max at the same time. latent_size is treated as the number of channels at the input block of the Generator, so the two must match.
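A minimal sketch of why they must match (illustrative only, not the repo's actual code; the class and parameter names here are hypothetical — only `latent_size` and `fmap_max` come from the config):

```python
import torch
import torch.nn as nn

latent_size = 256
fmap_max = 256  # assumption: must equal latent_size, or the first conv's
                # in_channels won't match the latent vector's channel count

class GFirstBlock(nn.Module):
    """Sketch of a generator input block: the latent vector is viewed
    as a 1x1 feature map with `latent_size` channels."""
    def __init__(self, in_channels, out_channels):
        super().__init__()
        # in_channels is derived from fmap_max in the config, so if
        # latent_size != fmap_max this conv rejects the input.
        self.conv = nn.ConvTranspose2d(in_channels, out_channels, kernel_size=4)

    def forward(self, z):
        # (N, latent_size) -> (N, latent_size, 1, 1) -> (N, out_channels, 4, 4)
        return self.conv(z.view(z.size(0), -1, 1, 1))

block = GFirstBlock(in_channels=fmap_max, out_channels=fmap_max)
z = torch.randn(8, latent_size)
out = block(z)
print(out.shape)  # works because latent_size == fmap_max
```

With `latent_size = 256` but `fmap_max = 512`, the `view` would produce 256 channels while the conv expects 512, and PyTorch raises a channel-mismatch error at the very first layer.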

I've fixed it.

Great. Looks like the same change is needed for the BEGAN model too.

BEGAN did not work! I changed a lot of things, but it's still not working...