pytorch-VAE

Variational Autoencoder and a Disentangled version (beta-VAE), implemented in PyTorch-Lightning.

  • Variational Autoencoder

    The Variational Autoencoder (VAE) is a generative model. Its goal is to learn the distribution of a dataset and then generate new (unseen) data points from that same distribution. A minimal sketch of this idea follows this list.

  • beta Variational Autoencoder

    Another form of the Variational Autoencoder is the beta-VAE. The difference between the vanilla VAE and the beta-VAE lies in the loss function of the latter: the KL-divergence term is multiplied by a hyperparameter beta. This encourages a disentangled latent representation, which in many cases yields a smoother, more "continuous" transition of the output data for small changes in the latent vector z (see the loss sketch after this list). More information on this topic can be found in the sources section below.
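Below is a minimal, illustrative sketch of the VAE idea in plain PyTorch. It is not this repository's code (the repository uses PyTorch-Lightning, and its architecture, layer sizes and class names will differ); it only shows the moving parts: an encoder mapping x to the parameters of a Gaussian q(z|x), the reparameterization trick to keep sampling differentiable, and generation of new data points by decoding z drawn from the standard-normal prior.

import torch
from torch import nn

class TinyVAE(nn.Module):
    """Minimal VAE sketch: encode to a Gaussian q(z|x), decode back to x-space.
    Hypothetical dimensions; not the architecture used in this repository."""

    def __init__(self, in_dim=784, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, latent_dim)
        self.to_logvar = nn.Linear(256, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, in_dim), nn.Sigmoid(),
        )

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps keeps the sampling step differentiable
        eps = torch.randn_like(mu)
        return mu + torch.exp(0.5 * logvar) * eps

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = self.reparameterize(mu, logvar)
        return self.decoder(z), mu, logvar

    @torch.no_grad()
    def sample(self, n):
        # draw z from the standard-normal prior and decode -> new (unseen) data points
        z = torch.randn(n, self.to_mu.out_features)
        return self.decoder(z)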

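The beta-VAE change itself lives entirely in the loss. Here is a hedged sketch of that loss, assuming a Bernoulli (binary cross-entropy) reconstruction term and a diagonal-Gaussian posterior; the repository's actual loss may weight or reduce the terms differently. Setting beta = 1 recovers the vanilla VAE objective, while beta > 1 pushes the posterior toward the factorized prior and encourages disentanglement.

import torch
import torch.nn.functional as F

def beta_vae_loss(x, x_hat, mu, logvar, beta=4.0):
    """Reconstruction term + beta * KL(q(z|x) || N(0, I)).
    Illustrative only; beta=1.0 gives the vanilla VAE loss."""
    recon = F.binary_cross_entropy(x_hat, x, reduction="sum")
    # closed-form KL divergence between N(mu, sigma^2) and N(0, I)
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + beta * kl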
Requirements

The Python dependencies are listed in requirements.txt and can be installed with the command in the Installation step below.

Execution

Installation

$ pip install -r requirements.txt

VAE:

$ python3 main.py -c <config_file_path> -v VAE

beta-VAE:

$ python3 main.py -c <config_file_path> -v B-VAE

Examples of configuration files can be found in the config directory.

Sources

  • Variational Autoencoder Paper
