Variational Auto Encoder with Concrete Latent Distribution

Keras implementation of a Variational Auto Encoder with a Concrete latent distribution. See Auto-Encoding Variational Bayes by Kingma and Welling and The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables by Maddison, Mnih and Teh or Categorical Reparameterization with Gumbel-Softmax by Jang, Gu and Poole.
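The Concrete (Gumbel-Softmax) distribution referenced above is a continuous relaxation of a categorical variable: Gumbel noise is added to the class logits and the result is passed through a temperature-annealed softmax. As a minimal sketch of that sampling step (a NumPy illustration, not the repository's own code):

```python
import numpy as np

def sample_concrete(logits, temperature, rng=None):
    """Draw one sample from the Concrete (Gumbel-Softmax) distribution.

    `logits` are unnormalized class log-probabilities; `temperature`
    controls how close the sample is to a discrete one-hot vector.
    """
    rng = rng or np.random.default_rng()
    # Gumbel(0, 1) noise via the inverse-CDF trick: -log(-log(U))
    u = rng.uniform(1e-8, 1.0, size=np.shape(logits))
    gumbel = -np.log(-np.log(u))
    # Softmax of the perturbed logits, annealed by the temperature
    y = (np.asarray(logits) + gumbel) / temperature
    y = y - y.max()  # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()

sample = sample_concrete(np.log([0.7, 0.2, 0.1]), temperature=0.5)
# Entries are positive and sum to 1; lowering the temperature pushes
# the sample towards a one-hot vector.
```

Because the sample is a deterministic, differentiable function of the logits plus parameter-free noise, gradients can flow through it, which is what makes a discrete latent usable inside a VAE.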

Examples

Samples from a regular VAE

VAE with Concrete latent distribution. Each column of the image corresponds to one of the categories of the latent Concrete distribution.

Usage

Traditional VAE with a 2 dimensional latent distribution

>>> from vae_concrete import VAE
>>> model = VAE(latent_cont_dim=2)
>>> model.fit(x_train, num_epochs=20)
>>> model.plot()

You should start seeing good results after ~5 epochs. The loss should approach ~140 upon convergence. Occasionally the optimization gets stuck in a poor local minimum with a loss of around ~205; in that case it is best to restart the optimization.
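The continuous part of the latent code is Gaussian, sampled with the reparameterization trick from Kingma and Welling: the sample is written as a deterministic function of the predicted mean and log-variance plus parameter-free noise. A minimal NumPy sketch of that trick (an illustration, not the repository's own code):

```python
import numpy as np

def reparameterize(mu, log_var, rng=None):
    """Gaussian reparameterization trick: z = mu + sigma * eps.

    Expressing the sample as mu + exp(log_var / 2) * eps, with
    eps ~ N(0, 1), lets gradients flow through mu and log_var.
    """
    rng = rng or np.random.default_rng()
    eps = rng.standard_normal(np.shape(mu))
    return np.asarray(mu) + np.exp(0.5 * np.asarray(log_var)) * eps
```

With `latent_cont_dim=2`, `mu` and `log_var` are the 2-dimensional outputs of the encoder, and the 2-D `z` is what `model.plot()` visualizes.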

VAE with 2 continuous variables and a 10 dimensional discrete distribution

>>> model = VAE(latent_cont_dim=2, latent_disc_dim=10)
>>> model.fit(x_train, num_epochs=10)
>>> model.plot()

It takes ~10 epochs to start seeing good results. The loss should go down to ~125.
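Part of that loss is the KL term contributed by the discrete latent. For a K-way categorical posterior q against a uniform prior it reduces to log K + sum_k q_k log q_k, which is 0 when q is uniform and log K when q is one-hot. A NumPy sketch of this term (an illustration of the formula, assuming a uniform prior as in the Gumbel-Softmax paper; the repository's actual loss may differ in detail):

```python
import numpy as np

def discrete_kl(q, eps=1e-10):
    """KL(q || Uniform(K)) = log K + sum_k q_k log q_k.

    `q` holds the class probabilities of the (relaxed) categorical
    posterior along the last axis; `eps` guards log(0).
    """
    q = np.asarray(q)
    K = q.shape[-1]
    return np.log(K) + np.sum(q * np.log(q + eps), axis=-1)
```

For the 10-dimensional discrete latent above this term ranges from 0 (uniform posterior) to log 10 ≈ 2.30 (fully confident posterior).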

Dependencies

  • keras
  • tensorflow (only tested with the TensorFlow backend)
  • plotly

Acknowledgements

The code was inspired by the Keras VAE implementation; the plotting functionality was also borrowed and modified from that example.
