Hello fellas, I am not able to decrease the output error over subsequent iterations of back-propagation with MSE loss. For some initial weight values, the loss increases and then goes to NaN. For other initialisations, the loss is NaN on every iteration.
If you want to check the maths, I have included two photos in the Loss Function area of the README.
I am using both random initialisation and normal-random initialisation of the weight matrices.
Check the code in src/classifier.c, line 379, for the gradient of the MSE loss function.
THE LOSS LOOKS LIKE THIS:
(for normal-random initialisation with mean 0.1 and std = 2*sqrt(1/input),
where input is the number of rows, i.e. data points)
000000: Loss: 0.451703
000001: Loss: 0.450800
000002: Loss: 0.451178
000003: Loss: 0.450949
000004: Loss: 0.450693
000005: Loss: 0.450952
000006: Loss: 0.450880
000007: Loss: 0.451021
000008: Loss: 0.450207
000009: Loss: 0.450087
000010: Loss: 0.450070
000011: Loss: 0.450410
000012: Loss: 0.450950
000013: Loss: 0.451022
000014: Loss: 0.450487
000015: Loss: 0.451102
000016: Loss: 0.450083