github-pengge/PyTorch-progressive_growing_of_gans

output sample images are incorrectly normalized

jcpeterson opened this issue · 6 comments

Some of the sample images look washed out. I suspect the min/max pixel values of the real samples and the generated ones are different, and the range fluctuates wildly from one output to the next. This makes it hard to verify quality during training most of the time.
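A quick way to check is to print the value range of each half of the saved grid separately (this assumes, as in the fix below, that the generated samples occupy the top half along axis 1 and the real samples the bottom half):

import numpy as np

half = samples.shape[1] // 2                      # fake rows on top, real rows below
fake, real = samples[:, :half, :], samples[:, half:, :]
print('fake: min %.3f  max %.3f' % (fake.min(), fake.max()))
print('real: min %.3f  max %.3f' % (real.min(), real.max()))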

Here's a quick hack to fix it:

half = samples.shape[1] // 2  # integer split point; fake rows on top, real rows below
# rescale each half to [0, 1] independently
samples[:, :half, :] = samples[:, :half, :] - np.min(samples[:, :half, :])
samples[:, :half, :] = samples[:, :half, :] / np.max(samples[:, :half, :])
samples[:, half:, :] = samples[:, half:, :] - np.min(samples[:, half:, :])
samples[:, half:, :] = samples[:, half:, :] / np.max(samples[:, half:, :])

Note that I also removed the white spacing between the image tiles.

This doesn't seem to fully fix it for some reason; I'm not sure why.

Same here, I can't figure out why either.

Something like this seems to remove outlier values and fix the problem:

half = samples.shape[1] // 2  # integer split point; fake rows on top

sd_fake = np.std(samples[:, :half, :])    # std of the generated half
m_fake = np.mean(samples[:, :half, :])    # mean of the generated half
margin = m_fake + (sd_fake * 4)           # anything beyond ~4 std devs is treated as an outlier

# clip the generated half to [-margin, margin] before normalizing
samples[:, :half, :] = np.clip(samples[:, :half, :], -margin, margin)
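For completeness, here is a minimal sketch that combines both ideas (clip outliers, then rescale each half to [0, 1] independently). It assumes samples stacks the generated images above the real ones along axis 1, as in the snippets above; normalize_halves is just an illustrative name, not something from the repo:

import numpy as np

def normalize_halves(samples, n_std=4):
    # Clip outliers, then rescale the fake (top) and real (bottom) halves independently.
    out = samples.astype(np.float32)                     # work on a copy
    half = out.shape[1] // 2
    for block in (out[:, :half, :], out[:, half:, :]):   # views into out, modified in place
        margin = block.mean() + n_std * block.std()      # values beyond ~n_std std devs count as outliers
        np.clip(block, -margin, margin, out=block)
        block -= block.min()                             # shift so the minimum is 0
        block /= max(block.max(), 1e-8)                  # scale to [0, 1], guarding against divide-by-zero
    return out

The per-half rescaling means the real and fake rows no longer share an absolute scale, but it keeps both halves visually comparable across checkpoints.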