Cycle-GAN-TF

Reimplementation of CycleGAN (https://arxiv.org/abs/1703.10593) with the improved WGAN (https://arxiv.org/abs/1704.00028) loss in TensorFlow.

Prerequisites

  • TensorFlow v1.0

Training Result

  • Training was done on an NVIDIA Titan X (Pascal) GPU.
  • Loss graph for aerial maps <-> maps
    • A (aerial map) -> B (map) -> A reconstruction on training examples (a->b->a)
    • B -> A -> B reconstruction on training examples (b->a->b)

Result on test sets

  • Each model was trained for 20,000 steps (20,000 × 8 / 1,000 ≈ 160 epochs).

  • aerial maps <-> maps

  • horse <-> zebra

  • apple <-> orange
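The epoch estimate above is steps × batch size ÷ dataset size; the batch size of 8 and the ~1,000-image dataset are read off the arithmetic in the bullet above, not verified against the code. A quick sketch:

```python
# Rough epoch estimate: steps * batch_size / dataset_size.
# Batch size 8 and ~1,000 training images are assumptions taken
# from the "20,000 x 8 / 1,000" arithmetic above.
def approx_epochs(steps, batch_size, dataset_size):
    return steps * batch_size / dataset_size

print(approx_epochs(20000, 8, 1000))  # -> 160.0
```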

Training

Download dataset

./download_dataset.sh [specify a dataset you want]

Run code

Before running the code, change the paths and hyper-parameters as desired in the code.

python main.py
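The training objective combines an adversarial WGAN term with CycleGAN's cycle-consistency term. Below is a minimal NumPy sketch of the cycle-consistency part (the L1 penalty on the a->b->a and b->a->b reconstructions); the generator callables and the weight `lam` are placeholders, not the repo's actual networks or hyper-parameters:

```python
import numpy as np

def cycle_loss(a, b, G, F, lam=10.0):
    """L1 cycle-consistency loss: lam * (|F(G(a)) - a| + |G(F(b)) - b|).
    G maps domain A -> B, F maps B -> A; both are stand-in callables here."""
    loss_a = np.mean(np.abs(F(G(a)) - a))  # a -> b -> a reconstruction error
    loss_b = np.mean(np.abs(G(F(b)) - b))  # b -> a -> b reconstruction error
    return lam * (loss_a + loss_b)

# Toy check: with identity "generators" the cycle loss is exactly zero.
a = np.random.rand(4, 8, 8, 3)
b = np.random.rand(4, 8, 8, 3)
identity = lambda x: x
print(cycle_loss(a, b, identity, identity))  # -> 0.0
```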

Using pretrained model & inference

Before running the code, change the paths as desired in the code.

python inference.py

Notes & Acknowledgement

  1. The dataset download script was copied from here.
  2. The network architecture may differ slightly from the original paper's.
    • For instance, a different D network (actually a critic network, since the Wasserstein GAN loss is used) is used.
  3. TensorFlow does not support reflection padding for conv (and deconv) layers, so some artifacts can be seen.
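The "improved W-GAN" loss used here is WGAN-GP, which penalizes the critic when its gradient norm at points interpolated between real and fake samples deviates from 1. A minimal NumPy sketch using a linear critic, whose input gradient is just its weight vector; in the real code this gradient would come from TensorFlow autodiff:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear critic f(x) = w . x; its gradient w.r.t. x is the constant w,
# which lets us illustrate the penalty without automatic differentiation.
w = rng.normal(size=16)
critic_grad = lambda x: w

def gradient_penalty(real, fake, grad_fn, lam=10.0):
    """WGAN-GP term: lam * mean((||grad f(x_hat)|| - 1)^2), where x_hat is a
    random per-sample interpolation between real and fake batches."""
    eps = rng.uniform(size=(real.shape[0], 1))
    x_hat = eps * real + (1 - eps) * fake
    norms = np.array([np.linalg.norm(grad_fn(x)) for x in x_hat])
    return lam * np.mean((norms - 1.0) ** 2)

real = rng.normal(size=(4, 16))
fake = rng.normal(size=(4, 16))
gp = gradient_penalty(real, fake, critic_grad)
# For a linear critic the gradient is constant, so gp = lam * (||w|| - 1)^2.
print(np.isclose(gp, 10.0 * (np.linalg.norm(w) - 1.0) ** 2))  # -> True
```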