
TensorFlow 2.0 implementation of Large Scale Adversarial Representation Learning (BigBiGAN)


BigBiGAN-TensorFlow2.0

Introduction

An unofficial low-resolution (32 x 32) implementation of BigBiGAN
Paper: https://arxiv.org/abs/1907.02544

Architecture

(BigBiGAN architecture diagram)

Dependencies

Python 3.6
TensorFlow 2.0.0
Matplotlib 3.1.1
NumPy 1.17

Implementation

Generator and discriminator

Discriminator F and Generator G come from BigGAN; there are 3 residual blocks in G and 4 residual blocks in F.
Discriminators H and J are 6-layer MLPs (units=64) with skip connections.
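A minimal sketch of such a skip-connection MLP in TF 2.0 is shown below. The layer count and width follow the description above; the class name, activation choice, and exact skip pattern are assumptions, not the repository's exact code.

```python
import tensorflow as tf

class SkipMLP(tf.keras.Model):
    """6-layer MLP (units=64) with skip connections, as used for H and J.

    Illustrative sketch; the activation and skip pattern are assumptions.
    """
    def __init__(self, units=64, num_layers=6):
        super().__init__()
        self.blocks = [tf.keras.layers.Dense(units) for _ in range(num_layers)]

    def call(self, x):
        h = self.blocks[0](x)
        for layer in self.blocks[1:]:
            # Skip (residual) connection around each hidden layer.
            h = h + tf.nn.leaky_relu(layer(h))
        return h
```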

Encoder

As suggested in the paper, the encoder takes a higher-resolution input (64 x 64) and uses a 13-layer RevNet as the backbone.
After the RevNet, a 4-layer MLP (units=256) follows.
RevNet: The Reversible Residual Network: Backpropagation Without Storing Activations (https://arxiv.org/abs/1707.04585)
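A rough sketch of the post-RevNet head under these settings (the RevNet backbone itself is omitted, and latent_dim is an assumed value, not taken from the repository):

```python
import tensorflow as tf

def make_encoder_head(latent_dim=120, units=256, num_layers=4):
    """4-layer MLP (units=256) applied to flattened RevNet features.

    Sketch only: latent_dim and the activation are assumptions.
    """
    layers = [tf.keras.layers.Flatten()]
    for _ in range(num_layers):
        layers.append(tf.keras.layers.Dense(units, activation=tf.nn.leaky_relu))
    layers.append(tf.keras.layers.Dense(latent_dim))  # final projection to z
    return tf.keras.Sequential(layers)
```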

Usage

Set up the flags in main.py, then run python3 main.py in a terminal to start training.

Supported datasets:

MNIST, Fashion-MNIST and CIFAR10.
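All three ship with tf.keras.datasets, so a loader that scales pixels to the [-1, 1] range GANs commonly use might look like the sketch below; the function name and batching choices are illustrative, not the repository's pipeline.

```python
import tensorflow as tf

def load_dataset(name="cifar10", batch_size=64):
    """Load one of the supported datasets and scale pixels to [-1, 1]."""
    datasets = {
        "mnist": tf.keras.datasets.mnist,
        "fashion_mnist": tf.keras.datasets.fashion_mnist,
        "cifar10": tf.keras.datasets.cifar10,
    }
    (x, y), _ = datasets[name].load_data()
    if x.ndim == 3:                       # MNIST/Fashion-MNIST lack a channel axis
        x = x[..., None]
    x = x.astype("float32") / 127.5 - 1.0
    return (tf.data.Dataset.from_tensor_slices((x, y))
            .shuffle(10_000)
            .batch(batch_size, drop_remainder=True))
```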

Conditional and unconditional GAN:

The conditional GAN uses the labels in the corresponding dataset to generate class-specific images.
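A common way to wire this up is to embed the label and concatenate it with the noise vector before it enters the generator; a hedged sketch (class name and dimensions are assumptions):

```python
import tensorflow as tf

class ConditionalInput(tf.keras.layers.Layer):
    """Concatenate a learned class embedding with the noise vector z.

    Sketch of the usual conditioning scheme; dimensions are assumptions.
    """
    def __init__(self, num_classes=10, embed_dim=32):
        super().__init__()
        self.embed = tf.keras.layers.Embedding(num_classes, embed_dim)

    def call(self, z, labels):
        e = self.embed(labels)             # (batch, embed_dim)
        return tf.concat([z, e], axis=-1)  # conditioned generator input
```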

Channels and batch size:

Change them to fit your GPU's VRAM (>= 6 GB recommended).

Conditional generation examples

(Sample grids for MNIST, Fashion-MNIST, and CIFAR10)

TODO

  • Stochastic encoder
  • Projection discriminator (see the sketch below)
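For reference, a projection-style output head (Miyato & Koyama, 2018) could look like the following sketch; the feature dimension and class count are assumptions:

```python
import tensorflow as tf

class ProjectionHead(tf.keras.layers.Layer):
    """Projection discriminator output: adds the inner product of a class
    embedding and the penultimate features to the unconditional logit.

    Sketch only; feat_dim and num_classes are assumed values.
    """
    def __init__(self, num_classes=10, feat_dim=64):
        super().__init__()
        self.linear = tf.keras.layers.Dense(1)
        self.embed = tf.keras.layers.Embedding(num_classes, feat_dim)

    def call(self, features, labels):
        logit = self.linear(features)                  # unconditional term
        proj = tf.reduce_sum(self.embed(labels) * features,
                             axis=-1, keepdims=True)   # class-projection term
        return logit + proj
```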

Known Issue

The encoder may not work properly and can map different generated images to the same latent vector (a stochastic encoder may help; see the sketch below).
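One common remedy is to let the encoder predict a distribution over z and sample with the reparameterization trick, instead of emitting a point estimate; a minimal sketch (latent_dim is an assumption):

```python
import tensorflow as tf

class StochasticHead(tf.keras.layers.Layer):
    """Predict a mean and log-variance, then sample z via reparameterization.

    Sketch of a stochastic encoder head; latent_dim is an assumed value.
    """
    def __init__(self, latent_dim=120):
        super().__init__()
        self.mu = tf.keras.layers.Dense(latent_dim)
        self.log_var = tf.keras.layers.Dense(latent_dim)

    def call(self, features):
        mu = self.mu(features)
        log_var = self.log_var(features)
        eps = tf.random.normal(tf.shape(mu))
        # Reparameterization: z = mu + sigma * eps keeps sampling differentiable.
        return mu + tf.exp(0.5 * log_var) * eps
```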

Reference

BigGAN: https://github.com/taki0112/BigGAN-Tensorflow
RevNet: https://github.com/google/revisiting-self-supervised

Authors