
Neural Ordinary Differential Equations

Repository for notes, projects, and snippets on Neural ODEs (NODEs). Includes results from training CNN-based networks with different methods on the MNIST, CIFAR-10, CelebA, and CatsAndDogs datasets.

Introduction

Neural ODEs extend Residual Neural Networks, yielding smoother gradients and general improvements in training. The original 2018 paper, Neural Ordinary Differential Equations, and the later ANODE paper are cited here for their techniques and algorithms.

In addition, Neural ODEs for undergraduate students is a great introduction to the subject, while Towards Understanding Normalization in Neural ODEs explains new advancements in normalizing NODEs.
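The core idea can be sketched as a drop-in replacement for a stack of residual blocks: instead of the discrete update h_{t+1} = h_t + f(h_t), the hidden state is evolved continuously by dh/dt = f(h, t) and integrated with an ODE solver. Below is a minimal sketch, assuming PyTorch and the torchdiffeq package; the repository's own layers live in dcodnn.py and may differ.

```python
# Minimal sketch of an ODE block (assumes torchdiffeq; not the exact dcodnn.py architecture).
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq

class ODEFunc(nn.Module):
    """Dynamics f(h, t) parameterised by a small CNN."""
    def __init__(self, channels=64):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, t, h):
        return self.conv2(torch.relu(self.conv1(torch.relu(h))))

class ODEBlock(nn.Module):
    """Replaces a stack of residual blocks: h(1) = h(0) + integral of f(h(t), t) dt."""
    def __init__(self, func):
        super().__init__()
        self.func = func
        self.t = torch.tensor([0.0, 1.0])

    def forward(self, h):
        # odeint returns the state at each time in self.t; keep only the final state.
        return odeint(self.func, h, self.t, rtol=1e-3, atol=1e-3)[-1]
```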

Installation

# clone the repository
git clone https://github.com/xAlpharax/NeuralODE-Notes-Projects

# install dependencies
pip3 install -qq -r requirements.txt

Training

Training the network defined in dcodnn.py can be done by running the commands below. Note that weights and visuals are placed in their respective directories. Furthermore, the CelebA dataset and its attributes can be downloaded from img_align_celeba.zip & list_attr_celeba.txt and placed in the drive directory.

# training the large DCODNN for 15 epochs with the given parameters (CelebA)
python3 train-node/train-celeb.py

# training the medium DCODNN for 30 epochs with the given parameters (CIFAR-10)
python3 train-node/train-cifar-10.py

# training the lighter DCODNN for 5 epochs with the given parameters (MNIST)
python3 train-node/train-mnist.py

Moreover, residual networks with a structure similar to the ones above can be trained with:

# training the large DRCNN for 15 epochs with the given parameters (CelebA)
python3 train-res/train-res-celeb.py

# training the medium DRCNN for 30 epochs with the given parameters (CIFAR-10)
python3 train-res/train-res-cifar-10.py

# training the lighter DRCNN for 5 epochs with the given parameters (MNIST)
python3 train-res/train-res-mnist.py
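For orientation, here is a minimal sketch of what a train-*.py script roughly does: load a dataset, train for a fixed number of epochs, then save the weights and a learning-curve plot to their directories. It assumes PyTorch/torchvision and a placeholder model, so it is not the repository's actual DCODNN/DRCNN code.

```python
# Sketch only: illustrates the general flow of a training script (data, epochs, weights, visuals).
import os
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms
import matplotlib.pyplot as plt

def train(epochs=5, lr=1e-3, batch_size=128):
    data = datasets.MNIST("data", train=True, download=True,
                          transform=transforms.ToTensor())
    loader = DataLoader(data, batch_size=batch_size, shuffle=True)

    # Placeholder classifier; the real DCODNN lives in dcodnn.py.
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(),
                          nn.Linear(128, 10))
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()

    history = []
    for epoch in range(epochs):
        running = 0.0
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
            running += loss.item()
        history.append(running / len(loader))
        print(f"epoch {epoch + 1}: loss {history[-1]:.4f}")

    # Weights and visuals go to their separate directories, mirroring the repo layout.
    os.makedirs("weights", exist_ok=True)
    os.makedirs("visuals", exist_ok=True)
    torch.save(model.state_dict(), "weights/mnist.pt")
    plt.plot(history)
    plt.xlabel("epoch")
    plt.ylabel("training loss")
    plt.savefig("visuals/mnist_loss.png")

if __name__ == "__main__":
    train()
```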

As for interactive training, there is a Colab Notebook for just that.

Results and comparison

|        | CELEBA          | CIFAR           | MNIST           |
|--------|-----------------|-----------------|-----------------|
| DCODNN | val_acc: 0.7457 | val_acc: 0.7059 | val_acc: 0.9865 |
| DRCNN  | val_acc: 0.7350 | val_acc: 0.7545 | val_acc: 0.9824 |

Figure: CelebA accuracy

Additional weights

CelebA weights are large files (~30 MB) and would slow down the repository, so download them from drive. There you can also find the fine-tuned DCODNN trained for more epochs with a smaller learning rate (5e-3).


TO DO:

  1. catsVSdogs dataset node & resnet

View on Trello


Contributing

If you would like to contribute, pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
