Wasserstein Auto-Encoders

Primary language: Python. License: BSD 3-Clause "New" or "Revised" License (BSD-3-Clause).

Repository info

This project implements an unsupervised generative modeling technique called Wasserstein Auto-Encoders (WAE), proposed by Tolstikhin, Bousquet, Gelly, and Schoelkopf (2017).
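For context, the objective a WAE minimizes (following the paper's formulation) is a reconstruction cost plus a penalized divergence between the aggregated posterior $Q_Z$ of the encoded data and the prior $P_Z$; here $c$ is a reconstruction cost, $G$ the decoder, and $\lambda > 0$ a regularization coefficient:

$$
D_{\mathrm{WAE}}(P_X, P_G) = \inf_{Q(Z|X) \in \mathcal{Q}} \; \mathbb{E}_{P_X}\,\mathbb{E}_{Q(Z|X)}\big[\, c\big(X, G(Z)\big) \,\big] + \lambda \, \mathcal{D}_Z(Q_Z, P_Z)
$$

Instantiating $\mathcal{D}_Z$ with an adversarially trained discriminator yields the WAE-GAN variant, while instantiating it with the maximum mean discrepancy (MMD) yields WAE-MMD.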

Repository structure

wae.py - everything specific to WAE, including encoder-decoder losses, various forms of distribution-matching penalties, and training pipelines (a sketch of the MMD penalty appears after this list)

run.py - master script to train a specific model on a selected dataset with specified hyperparameters
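As a concrete illustration of the distribution-matching penalty used by WAE-MMD, below is a minimal NumPy sketch of an unbiased MMD² estimate with the inverse multiquadratic (IMQ) kernel favored in the paper. The function names, the single kernel scale, and the NumPy backing are illustrative assumptions for this sketch, not the repository's actual API:

```python
import numpy as np

def imq_kernel(a, b, scale):
    """Inverse multiquadratic kernel k(x, y) = scale / (scale + ||x - y||^2)."""
    sq_dists = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return scale / (scale + sq_dists)

def mmd_penalty(z_encoded, z_prior, sigma2=1.0):
    """Unbiased estimate of MMD^2 between encoder outputs and prior samples."""
    n, z_dim = z_encoded.shape
    # One common scale choice for a Gaussian prior: C = 2 * z_dim * sigma^2.
    c = 2.0 * z_dim * sigma2
    k_qq = imq_kernel(z_encoded, z_encoded, c)
    k_pp = imq_kernel(z_prior, z_prior, c)
    k_qp = imq_kernel(z_encoded, z_prior, c)
    off_diag = 1.0 - np.eye(n)  # drop i == j terms for the unbiased estimate
    mmd = (np.sum(k_qq * off_diag) + np.sum(k_pp * off_diag)) / (n * (n - 1))
    return mmd - 2.0 * np.mean(k_qp)

# Toy check: codes from a hypothetical encoder vs. samples from the Gaussian prior.
rng = np.random.default_rng(0)
penalty = mmd_penalty(rng.normal(size=(128, 8)), rng.normal(size=(128, 8)))
print(penalty)  # close to zero when both sets come from the same distribution
```

During training, this penalty would be added to the reconstruction loss, weighted by the λ coefficient from the objective above.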

Example output pictures

The following picture shows various characteristics of the WAE-MMD model trained on CelebA after 50 epochs:

WAE-MMD progress