WGAN-GP-DRAGAN-Celeba-Pytorch

PyTorch implementation of WGAN-GP and DRAGAN


WGAN-GP and DRAGAN

PyTorch implementation of WGAN-GP and DRAGAN, both of which use a gradient penalty to improve training stability. All experiments use the DCGAN network architecture.

WGAN-GP: Improved Training of Wasserstein GANs

DRAGAN: On Convergence and Stability of GANs
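Both methods penalize the discriminator when its gradient norm deviates from 1; they differ only in where the penalty is evaluated. WGAN-GP samples points on straight lines between real and fake data, while DRAGAN perturbs real samples with noise. The sketch below illustrates the idea in PyTorch; the function name, signature, and noise scale are illustrative and may differ from this repo's actual code.

```python
import torch

def gradient_penalty(discriminator, real, fake=None, mode="wgan-gp"):
    """One-centered gradient penalty (illustrative sketch).

    Assumes 4-D image batches of shape (B, C, H, W).
    """
    if mode == "wgan-gp":
        # Interpolate between real and fake samples (one alpha per sample).
        alpha = torch.rand(real.size(0), 1, 1, 1, device=real.device)
        x_hat = alpha * real + (1 - alpha) * fake
    else:
        # DRAGAN: perturb real samples with noise scaled by their std.
        x_hat = real + 0.5 * real.std() * torch.rand_like(real)
    x_hat = x_hat.detach().requires_grad_(True)
    d_out = discriminator(x_hat)
    # Gradient of the discriminator output w.r.t. the penalty points.
    grad = torch.autograd.grad(
        outputs=d_out,
        inputs=x_hat,
        grad_outputs=torch.ones_like(d_out),
        create_graph=True,
    )[0]
    # Per-sample L2 norm, pushed toward 1.
    grad_norm = grad.reshape(grad.size(0), -1).norm(2, dim=1)
    return ((grad_norm - 1.0) ** 2).mean()
```

In training, this penalty (times a weight, commonly 10) is added to the discriminator loss; `create_graph=True` is required so the penalty itself can be backpropagated.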

Exemplar results

Celeba

left: WGAN-GP after 100 epochs, right: DRAGAN after 100 epochs

Loss curves (very stable training!!!)

left: WGAN-GP after 100 epochs, right: DRAGAN after 100 epochs

Prerequisites

Usage

Configuration

You can directly change configuration values such as the GPU id and the learning rate at the top of each script.
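Such a configuration header might look like the following; the variable names and values here are illustrative, not the repo's actual settings.

```python
# Hypothetical configuration block at the top of a training script;
# check the actual script for the real names and defaults.
gpu_id = 0          # which GPU to train on
epochs = 100        # number of training epochs
batch_size = 64     # images per batch
lr = 0.0002         # learning rate
n_critic = 5        # discriminator updates per generator update
z_dim = 100         # dimensionality of the latent noise vector
```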

Train

python train_celeba_wgan_gp.py
python train_celeba_dragan.py
...

TensorBoard

If you have TensorBoard installed, you can use it to view the loss curves.

tensorboard --logdir=./summaries/celeba_wgan_gp --port=6006
...

Datasets

  1. CelebA must be downloaded by yourself; place the aligned images in ./data/img_align_celeba/img_align_celeba/