Improved Training of Wasserstein GANs

Code for reproducing experiments in "Improved Training of Wasserstein GANs".

Prerequisites

  • Python, NumPy, TensorFlow, SciPy, Matplotlib
  • A recent NVIDIA GPU
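
A minimal setup sketch, assuming a pip-based environment; the code is written against the TensorFlow 1.x API, so a 1.x release (which in turn requires an older Python interpreter) is assumed here rather than specified by the repository:

    # Illustrative only; the repository does not pin exact versions.
    pip install numpy scipy matplotlib
    pip install "tensorflow-gpu<2"   # a TensorFlow 1.x build, for GPU training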

Models

Configuration for each model is specified in a list of constants at the top of its file (a sketch of typical settings follows the list below). Two models should work "out of the box":

  • python gan_toy.py: Toy datasets (8 Gaussians, 25 Gaussians, Swiss Roll).
  • python gan_mnist.py: MNIST.
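
For example, the settings at the top of gan_mnist.py look roughly like the sketch below; the constant names and values are assumptions based on the hyperparameters reported in the paper, so check the file itself for what it actually defines.

    # Illustrative configuration; confirm against the constants defined in gan_mnist.py.
    MODE = 'wgan-gp'   # objective to train with, e.g. 'wgan-gp', 'wgan', or 'dcgan'
    DIM = 64           # model dimensionality
    BATCH_SIZE = 50    # batch size
    CRITIC_ITERS = 5   # critic updates per generator update
    LAMBDA = 10        # gradient penalty coefficient
    ITERS = 200000     # number of generator iterations to train for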

For the other models, edit the file to set DATA_DIR to the path of the dataset before running (see the example after the list below). Each model's dataset is publicly available, and a download URL is given in the file.

  • python gan_64x64.py: 64x64 architectures (this code trains on ImageNet rather than the LSUN bedrooms dataset used in the paper)
  • python gan_language.py: Character-level language model
  • python gan_cifar.py: CIFAR-10
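
For instance, pointing gan_cifar.py at a local copy of CIFAR-10 might look like the following; the directory shown is only a placeholder for wherever the extracted dataset lives.

    # Near the top of gan_cifar.py (placeholder path; adjust to your setup):
    DATA_DIR = '/path/to/cifar-10-batches-py'

After setting the path, the model is launched the same way as the others, e.g. python gan_cifar.py.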