VAE CVAE MNIST

Variational Autoencoder and Conditional Variational Autoencoder on MNIST in PyTorch


Variational Autoencoder & Conditional Variational Autoencoder on MNIST

VAE paper: Auto-Encoding Variational Bayes

CVAE paper: Semi-supervised Learning with Deep Generative Models


To train the conditional variational autoencoder, add --conditional to the command. Check out the other command-line options in the code for hyperparameter settings (such as learning rate, batch size, and encoder/decoder layer depth and size).
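
In the usual CVAE formulation, the --conditional setting means that both the encoder and the decoder receive the digit label as an extra input. The sketch below illustrates this by concatenating a one-hot label to the encoder and decoder inputs; the class name, layer sizes, and dimensions are illustrative assumptions, not the repository's actual code.

```python
import torch
import torch.nn as nn

# Illustrative sketch only; names and sizes are assumptions, not the repo's code.
class ConditionalVAE(nn.Module):
    def __init__(self, x_dim=784, c_dim=10, h_dim=256, z_dim=2):
        super().__init__()
        # Encoder q(z|x,c): the one-hot label c is concatenated to the flattened image.
        self.enc = nn.Sequential(nn.Linear(x_dim + c_dim, h_dim), nn.ReLU())
        self.fc_mu = nn.Linear(h_dim, z_dim)
        self.fc_logvar = nn.Linear(h_dim, z_dim)
        # Decoder p(x|z,c): the same one-hot label is concatenated to the latent code.
        self.dec = nn.Sequential(
            nn.Linear(z_dim + c_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, x_dim), nn.Sigmoid())

    def forward(self, x, c):
        h = self.enc(torch.cat([x, c], dim=-1))
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        x_recon = self.dec(torch.cat([z, c], dim=-1))
        return x_recon, mu, logvar
```

Dropping the c tensor from both concatenations recovers the plain (unconditional) VAE.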


Results

All plots were obtained after 10 epochs of training. Hyperparameters follow the default settings in the code and were not tuned.

z ~ q(z|x) and q(z|x,c)

The learned latent distribution after 10 epochs, with 100 samples per digit.

[Latent space plots: VAE and CVAE]
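
These scatter plots show latent codes drawn from the approximate posterior. A minimal sketch of how such a plot can be produced, assuming a 2-dimensional latent space and an encoder callable that returns (mu, logvar) for a batch of flattened images (both are assumptions, not the repository's API):

```python
import torch
import matplotlib.pyplot as plt

# Hypothetical sketch; `encoder` is an assumed callable returning (mu, logvar).
def plot_latent(encoder, images, labels):
    with torch.no_grad():
        mu, logvar = encoder(images.view(images.size(0), -1))
        # Sample z ~ q(z|x) via the reparameterization trick.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
    plt.scatter(z[:, 0], z[:, 1], c=labels, cmap="tab10", s=5)
    plt.xlabel("z[0]")
    plt.ylabel("z[1]")
    plt.colorbar(label="digit")
    plt.show()
```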

p(x|z) and p(x|z,c)

Randomly sampled z and the corresponding decoder outputs. For the CVAE, each class label c was given as input once.

[Generated samples: VAE and CVAE]
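
For these samples, z is drawn from the standard normal prior and passed through the decoder; for the CVAE, one decode is done per one-hot label. A rough sketch under the assumption that the decoder takes the concatenation of z and a one-hot c and returns a flattened 28x28 image (a hypothetical interface, not the repository's actual code):

```python
import torch

# Sketch of generation from the prior; `decoder` is an assumed callable mapping
# the concatenation of z and a one-hot c to a flattened 28x28 image.
def sample_digits(decoder, z_dim=2, n_classes=10):
    with torch.no_grad():
        z = torch.randn(n_classes, z_dim)       # z ~ N(0, I)
        c = torch.eye(n_classes)                 # each label c given once
        x = decoder(torch.cat([z, c], dim=-1))   # p(x|z,c)
    return x.view(n_classes, 28, 28)
```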