An unofficial low-resolution (32 x 32) implementation of BigBiGAN
Paper: https://arxiv.org/abs/1907.02544
Python 3.6
TensorFlow 2.0.0
Matplotlib 3.1.1
Numpy 1.17
Discriminator F and Generator G come from BigGAN.
There are 3 residual blocks in G and 4 residual blocks in F.
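The residual blocks above follow the usual BigGAN pattern: a shortcut path and a convolutional residual path, both upsampled in the generator, then summed. A minimal NumPy sketch of the idea (the `conv` argument is a hypothetical stand-in for the conv/BN/ReLU stack; it is not the repository's actual layer):

```python
import numpy as np

def upsample_nn(x):
    # Nearest-neighbour 2x upsampling for NHWC tensors.
    return x.repeat(2, axis=1).repeat(2, axis=2)

def res_block_up(x, conv):
    # BigGAN-style up-sampling residual block (sketch):
    # both paths are upsampled, then added elementwise.
    # `conv` stands in for the learned residual branch.
    shortcut = upsample_nn(x)
    residual = conv(upsample_nn(x))
    return shortcut + residual
```

The downsampling blocks in F mirror this with average pooling instead of upsampling.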
Discriminators H and J are 6-layer MLPs (units=64) with skip connections.
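A skip-connected MLP of this shape can be sketched as below. This is an illustrative NumPy forward pass, not the repository's code; the residual skip assumes the input has already been projected to the hidden width (64), which is an assumption here:

```python
import numpy as np

def mlp_with_skips(x, weights, biases):
    # 6-layer MLP where each hidden layer adds a residual skip
    # from its own input (hidden width 64 assumed, so shapes match).
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(h @ W + b, 0.0) + h   # ReLU + skip connection
    return h @ weights[-1] + biases[-1]      # final linear score
```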
As suggested in the paper, the encoder takes higher-resolution input (64 x 64) and uses a 13-layer RevNet as its backbone.
The RevNet is followed by a 4-layer MLP (units=256).
The Reversible Residual Network: Backpropagation Without Storing Activations(https://arxiv.org/abs/1707.04585)
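The core trick of the RevNet cited above is that each block is exactly invertible, so intermediate activations can be recomputed during backprop instead of stored. A minimal NumPy sketch of one reversible block and its inverse (f and g stand in for arbitrary residual functions):

```python
import numpy as np

def rev_block_forward(x1, x2, f, g):
    # Reversible residual block (Gomez et al. 2017): the input is
    # split into two halves, each updated by a residual function.
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def rev_block_inverse(y1, y2, f, g):
    # Exact inverse: recover the inputs from the outputs alone,
    # so activations need not be stored for backpropagation.
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2
```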
Set up the flags in main.py, then run
python3 main.py
in a terminal to start training.
Supported datasets: MNIST, Fashion-MNIST and CIFAR-10.
Conditional GAN will use the labels in the corresponding dataset to generate class-specific images.
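One common way to condition the generator on a class label is to concatenate a learned label embedding with the latent vector. This is a hedged NumPy sketch of that pattern, not necessarily how this repository wires the labels in (the embedding width is an assumption):

```python
import numpy as np

def conditional_latent(z, labels, embedding):
    # Class-conditional generator input (sketch): look up a learned
    # embedding for each label and concatenate it with the latent z.
    return np.concatenate([z, embedding[labels]], axis=1)
```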
Adjust these settings to fit your GPU's VRAM (>= 6 GB recommended).
- Stochastic encoder
- Projection discriminator
The encoder may not work properly and may map different generated images to the same latent vector (a stochastic encoder may help).
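The stochastic encoder listed in the TODOs would predict a distribution over latents rather than a single point, which can discourage distinct images from collapsing onto one latent vector. A minimal NumPy sketch of the reparameterization trick such an encoder would use (the Gaussian parameterization is an assumption, following the BigBiGAN paper):

```python
import numpy as np

def stochastic_encode(mu, log_var, rng):
    # Stochastic encoder head (sketch): sample a latent from a
    # predicted Gaussian via the reparameterization trick,
    # z = mu + sigma * eps, keeping the sampling differentiable.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps
```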
BigGAN https://github.com/taki0112/BigGAN-Tensorflow
RevNet https://github.com/google/revisiting-self-supervised
- ZHONG Liangyu, https://github.com/LEGO999