taesungp/swapping-autoencoder-pytorch

Training issue

zacharyclam opened this issue · 2 comments

Thanks for the awesome work.
I have used the church dataset to train the model on a single RTX 2080 Ti with a batch size of 4 for 1,850,000 iterations, but the results still show many artifacts in complex regions. Is this because the batch size is too small, or simply because the number of iterations is not enough?
[images: example results showing artifacts]

Hello, I recommend training a bit longer. Here is a sample result of swapping at different iterations. The quality keeps improving with more iterations, even beyond the default setting of 25M images. I think training with a larger batch size should also help, but I have not personally experimented with it.

[image: swapping results at increasing iterations]

@taesungp
Hi, thanks for the awesome work.
It seems training will take a long time. Could you provide your training curve of the discriminator output for reference?
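(For anyone wanting to inspect such a curve from their own run: a minimal sketch, not part of this repository, assuming the discriminator outputs were logged to a hypothetical CSV file `train_log.csv` with columns `iter`, `D_real`, and `D_fake`.)

```python
# Hypothetical example: plot discriminator outputs over training iterations
# from a CSV log. File name and column names are assumptions, not the repo's format.
import csv

import matplotlib.pyplot as plt

iters, d_real, d_fake = [], [], []
with open("train_log.csv", newline="") as f:      # hypothetical log path
    for row in csv.DictReader(f):
        iters.append(int(row["iter"]))
        d_real.append(float(row["D_real"]))        # discriminator output on real images
        d_fake.append(float(row["D_fake"]))        # discriminator output on swapped/fake images

plt.plot(iters, d_real, label="D(real)")
plt.plot(iters, d_fake, label="D(fake)")
plt.xlabel("iteration")
plt.ylabel("discriminator output")
plt.legend()
plt.savefig("d_curve.png")
```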