Residual neural network code implemented now

@cslr cslr released this 19 Feb 09:39
· 134 commits to RBM_test since this release

This is an important update merged from the private repo. It enables residual neural networks (enabled in nntool). The code automatically adds skip connections from even-numbered layers forward whenever both layers have the same number of neurons. It therefore makes mini skips over two layers, from the input layer all the way to the output layer, implementing a residual neural network architecture.

Deep learning: on a simple test problem (test_data_residual.sh), a 40-layer dense residual neural network with leaky rectifier non-linearity can learn the problem in about 10 minutes, and a 20-layer residual neural network gives perfect results.