GIN

Code for the paper "Disentanglement by Nonlinear ICA with General Incompressible-flow Networks (GIN)" (2020)

Prerequisites

Make sure you have numpy and pytorch installed (recommended versions below). Then install FrEIA:

pip install git+https://github.com/VLL-HD/FrEIA.git
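To verify the installation, a quick import check like the following should run without errors (a minimal sketch; the module names are those provided by the packages above):

# Optional sanity check: these imports should succeed after installation.
import numpy as np
import torch
import FrEIA.framework as Ff      # invertible-network framework installed above
import FrEIA.modules as Fm        # coupling blocks, permutations, etc.

print(np.__version__, torch.__version__)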

Recommended package versions

The scripts in this repository were tested with the following package versions (they may also work with earlier versions, e.g. Python 3.7):

  • python 3.8.3
  • numpy 1.18.1
  • matplotlib 3.1.3
  • pytorch 1.5.0
  • torchvision 0.6.0
  • cudatoolkit 10.2.89

Tests were run on both the CPU (artificial data only) and the GPU (artificial data and EMNIST).
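A typical PyTorch device-selection pattern for running on either CPU or GPU looks like the following (shown only for illustration; the repository scripts may handle this differently):

import torch

# Pick a GPU if one is available, otherwise fall back to the CPU.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
print('running on', device)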

Usage

Clone the repository:

git clone https://github.com/VLL-HD/GIN.git
cd GIN

Artificial Data

To see the available options:

python artificial_data.py -h

Reconstructions are saved in ./artificial_data_save/{timestamp}/figures. Eight reconstructions are plotted, each corresponding to a different orientation of the reconstructed latent space.
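The eight orientations presumably correspond to the axis-aligned symmetries of a two-dimensional informative latent space: 2 permutations times 4 sign combinations gives 8. A small, purely illustrative enumeration with stand-in latents:

# Hypothetical illustration: with two informative latent dimensions there are
# 2! permutations x 2^2 sign flips = 8 axis-aligned orientations, matching the
# eight reconstruction plots.
import itertools
import numpy as np

z = np.random.randn(1000, 2)                    # stand-in for recovered latents
orientations = [
    z[:, list(perm)] * np.array(signs)
    for perm in itertools.permutations(range(2))
    for signs in itertools.product((1.0, -1.0), repeat=2)
]
print(len(orientations))                        # 8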

Example reconstruction plot:

(figure: artificial_data_reconstruction_plot)

EMNIST

To see the available options:

python emnist.py -h

Model checkpoints (.pt files) are saved in ./emnist_save/{timestamp}/model_save with the specified save frequency. Figures are saved in ./emnist_save/{timestamp}/figures whenever a checkpoint is made (including at the end of training).
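A saved checkpoint can be loaded back with torch.load. The exact contents (a full model object vs. a state_dict) depend on how emnist.py saves it, so the snippet below is only a hypothetical inspection with a placeholder path and filename:

import torch

# Hypothetical: inspect a checkpoint from a finished run (placeholder path).
ckpt = torch.load('./emnist_save/{timestamp}/model_save/checkpoint.pt',
                  map_location='cpu')   # load onto CPU regardless of training device
print(type(ckpt))                       # e.g. an nn.Module or an OrderedDict state_dict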

Example plots:

(figures: emnist_spectrum, emnist_first_dim)

For further details please refer to the paper.
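For orientation only, here is a minimal, self-contained sketch of the idea behind GIN: an affine coupling layer whose log-scales are constrained to sum to zero, so the Jacobian determinant is exactly 1 (volume preserving). This is plain PyTorch written for illustration; it is not the model definition used in this repository, which is built with FrEIA.

import torch
import torch.nn as nn

class GINCouplingSketch(nn.Module):
    """Illustrative volume-preserving coupling layer (not the repository code)."""
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.split = dim // 2
        # Subnetwork predicts log-scales s and shifts t for the second half
        # of the input, conditioned on the first half.
        self.subnet = nn.Sequential(
            nn.Linear(self.split, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.split)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.split], x[:, self.split:]
        s, t = self.subnet(x1).chunk(2, dim=1)
        # GIN constraint: shift the log-scales so they sum to zero, making the
        # Jacobian determinant exactly 1 (one simple way to enforce this).
        s = s - s.mean(dim=1, keepdim=True)
        y2 = x2 * torch.exp(s) + t
        return torch.cat([x1, y2], dim=1)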
