
Variational Autoencoder & Conditional Variational Autoencoder on MNIST in PyTorch

VAE paper: Auto-Encoding Variational Bayes

CVAE paper: Learning Structured Output Representation using Deep Conditional Generative Models


In order to run the conditional variational autoencoder, add --conditional to the command. Check out the other command-line options in the code for hyperparameter settings (such as learning rate, batch size, and encoder/decoder layer depth and size).
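
For example (the script name train.py and any flags other than --conditional are assumptions here; check the argument parser in the code for the exact names and defaults):

```bash
python train.py                            # train the VAE with default hyperparameters
python train.py --conditional              # train the CVAE instead
python train.py --conditional --epochs 10  # --epochs is a hypothetical flag name for the epoch count
```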


Results

All plots were obtained after 10 epochs of training. Hyperparameters are the default settings in the code; they have not been tuned.

z ~ q(z|x) and q(z|x,c)

The modeled latent distribution after 10 epochs, with 100 samples per digit.

[Figure: latent space of the VAE (left) and the CVAE (right)]
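
A minimal sketch of how such a latent plot can be produced (not the repository's plotting code): the encoder below is an untrained stand-in that maps a flattened image to the mean and log-variance of q(z|x), and a 2-dimensional latent space is assumed so the codes can be plotted directly; in practice the trained model's encoder would be used.

```python
import torch
import torch.nn as nn
import matplotlib.pyplot as plt
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

latent_size = 2

# Stand-in encoder for q(z|x): flattened 28x28 image -> (mean, log_var).
# Untrained here; in practice this would be the trained model's encoder.
class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.hidden = nn.Sequential(nn.Linear(784, 256), nn.ReLU())
        self.mean = nn.Linear(256, latent_size)
        self.log_var = nn.Linear(256, latent_size)

    def forward(self, x):
        h = self.hidden(x)
        return self.mean(h), self.log_var(h)

encoder = Encoder()
test_loader = DataLoader(
    datasets.MNIST("data", train=False, download=True, transform=transforms.ToTensor()),
    batch_size=256)

zs, labels = [], []
with torch.no_grad():
    for x, y in test_loader:
        mean, log_var = encoder(x.view(x.size(0), -1))
        # Reparameterization trick: z ~ q(z|x) = N(mean, exp(log_var))
        z = mean + torch.exp(0.5 * log_var) * torch.randn_like(mean)
        zs.append(z)
        labels.append(y)

zs, labels = torch.cat(zs), torch.cat(labels)
plt.scatter(zs[:, 0], zs[:, 1], c=labels, cmap="tab10", s=4)  # one color per digit
plt.colorbar(label="digit")
plt.show()
```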

p(x|z) and p(x|z,c)

Randomly sampled z and their decoded outputs. For the CVAE, each condition c was given as input once (see the sketch below the figure).

[Figure: generated samples from the VAE (left) and the CVAE (right)]
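
A minimal sketch of the sampling mechanics (assumptions: the decoder below is an untrained stand-in with arbitrary layer sizes, and the condition is passed by concatenating a one-hot label to z; the repository's exact architecture and interface may differ): z is drawn from the standard normal prior and, for the CVAE, decoded once per condition c.

```python
import torch
import torch.nn as nn

latent_size, num_labels = 2, 10

# Stand-in CVAE decoder p(x|z, c): takes z concatenated with a one-hot label c
# and outputs a flattened 28x28 image. Untrained here; in practice the trained
# decoder would be used.
decoder = nn.Sequential(
    nn.Linear(latent_size + num_labels, 256), nn.ReLU(),
    nn.Linear(256, 784), nn.Sigmoid())

z = torch.randn(num_labels, latent_size)    # z ~ N(0, I), one sample per digit
c = torch.eye(num_labels)                   # each condition c as a one-hot vector, given once
with torch.no_grad():
    x = decoder(torch.cat([z, c], dim=-1))  # sample from p(x|z, c)
images = x.view(num_labels, 28, 28)         # one generated image per digit class
```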