FaceGen-GAN

Conditional face generation experiments using GAN models on CelebA dataset.

Architectures

  • Vanilla DCGAN: a standard DCGAN as described in the DCGAN paper; it suffers from training stability issues.
  • Hinge DCGAN with custom layers: an improved DCGAN trained with the hinge loss and extended with spectral normalization, self-attention, minibatch standard deviation, and pixelwise normalization, which together allow stable training and better visual results than the vanilla DCGAN (see the sketch after this list).
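
For reference, here is a minimal, hypothetical sketch of the hinge losses and two of the custom layers (minibatch standard deviation and pixelwise normalization). PyTorch is assumed and all names are illustrative; this is not the repository's actual code.

```python
# Hypothetical PyTorch sketch of the hinge GAN losses and two of the custom
# layers mentioned above; names and framework are assumptions, not repo code.
import torch
import torch.nn as nn
import torch.nn.functional as F


def d_hinge_loss(real_logits, fake_logits):
    # Discriminator hinge loss: push real logits above +1 and fake logits below -1.
    return F.relu(1.0 - real_logits).mean() + F.relu(1.0 + fake_logits).mean()


def g_hinge_loss(fake_logits):
    # Generator hinge loss: raise the discriminator's score on generated samples.
    return -fake_logits.mean()


class PixelwiseNorm(nn.Module):
    # Normalizes each pixel's feature vector (across channels) to roughly unit length.
    def forward(self, x, eps=1e-8):
        return x * torch.rsqrt(x.pow(2).mean(dim=1, keepdim=True) + eps)


class MinibatchStd(nn.Module):
    # Appends the average per-feature standard deviation over the batch as an
    # extra channel, giving the discriminator a simple diversity signal.
    def forward(self, x):
        std = x.std(dim=0, unbiased=False).mean()
        std_map = std.expand(x.size(0), 1, x.size(2), x.size(3))
        return torch.cat([x, std_map], dim=1)
```

Spectral normalization itself is available in PyTorch as torch.nn.utils.spectral_norm, which can wrap the discriminator's convolutional layers.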

EMA Training

To improve generated image quality, it is also possible to train a model with an exponential moving average (EMA) update, as described in the paper The Unusual Effectiveness of Averaging in GAN Training.

The code is based on the update function found here, which maintains a second generator model whose weights are an EMA of the trained generator's weights, using the following equation:

w_{t+1} = (1 - β) * u_t + β * w_t

where u_t are the weights of the generator trained via gradient methods, w_t are the weights of the EMA generator, and β is the EMA decay rate.
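
As a concrete illustration, a minimal sketch of this update in PyTorch might look like the following; the function and variable names are hypothetical and the framework choice is an assumption.

```python
# Hypothetical PyTorch sketch of the EMA update above; names are illustrative.
import copy

import torch


@torch.no_grad()
def ema_update(ema_generator, generator, beta=0.999):
    # w_{t+1} = (1 - beta) * u_t + beta * w_t, applied to every parameter.
    for w, u in zip(ema_generator.parameters(), generator.parameters()):
        w.mul_(beta).add_(u, alpha=1.0 - beta)
    # Buffers (e.g. batch-norm running statistics) are simply copied over.
    for w, u in zip(ema_generator.buffers(), generator.buffers()):
        w.copy_(u)


# Typical usage: keep a frozen deep copy of the generator and update it after
# every optimizer step of the regular generator.
# generator_ema = copy.deepcopy(generator).eval()
# ema_update(generator_ema, generator, beta=0.999)
```

At evaluation time, samples are drawn from the EMA generator rather than the one being optimized, which typically yields smoother, higher-quality images.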