Releases: cslr/dinrhiw2
Version v0.90 release
Version 0.80 release
Residual neural network code now works and allows the use of a 40-layer neural network, although results are not yet very good. Also includes small bugfixes here and there.
Residual neural network code now implemented
This is an important update from the private repo. It enables residual neural networks (enabled in nntool). The code automatically adds skip connections forward from even-numbered layers when the number of neurons in both layers is the same. It therefore makes mini skips over two layers, from the input layer all the way to the output layer, and implements a residual neural network architecture.
Deep learning: for a simple test problem (test_data_residual.sh), the neural network can learn the problem with 40 layers in 10 minutes (a dense residual neural network with leaky rectifier non-linearity); a 20-layer residual neural network gives perfect results.
dinrhiw2-private-repo-sync-gcc9
Long-awaited dinrhiw2-private repo sync that fixes bugs and makes TSNE work correctly with a small number of data points.
The code requires a GCC 9.* compiler, as the GCC 8.* std::random_device has bugs which break the class RNG random number generation code.