This is a naive implementation of a softmax classifier with a cross-entropy loss function. A Jupyter notebook version will be added soon, along with a blog post on this.
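A minimal sketch of what such a classifier looks like, assuming NumPy, a 4-feature/3-class toy dataset shaped like the output below, and plain per-sample gradient descent (the function and variable names here are illustrative, not taken from the repo's code):

```python
import numpy as np

def softmax(z):
    # shift by the max for numerical stability before exponentiating
    e = np.exp(z - np.max(z))
    return e / e.sum()

def cross_entropy(guess, actual):
    # small epsilon guards against log(0)
    return -np.sum(actual * np.log(guess + 1e-12))

def train(X, Y, lr=0.5, epochs=500, seed=0):
    rng = np.random.RandomState(seed)
    n_features, n_classes = X.shape[1], Y.shape[1]
    w = rng.rand(n_features, n_classes)   # 4x3 weight matrix
    b = rng.rand(n_classes)               # 3-vector of biases
    for _ in range(epochs):
        for x, y in zip(X, Y):
            guess = softmax(x @ w + b)
            grad = guess - y              # gradient of CE loss w.r.t. the logits
            w -= lr * np.outer(x, grad)
            b -= lr * grad
    return w, b

# toy data: 3 samples, 4 features, one-hot labels for 3 classes
X = np.array([[1., 1., 0., 0.],
              [0., 1., 1., 0.],
              [0., 0., 1., 1.]])
Y = np.eye(3)

w, b = train(X, Y)
for x, y in zip(X, Y):
    print("guess=", softmax(x @ w + b), "actual=", y)
```

The output below follows the same pattern: `guess` is the softmax probability vector, `actual` is the one-hot label, and the error drops sharply once the weights are trained.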
OUTPUT
b= [ 0.56804456 0.92559664 0.07103606]
w= [[ 0.5488135   0.71518937  0.60276338]
 [ 0.54488318  0.4236548   0.64589411]
 [ 0.43758721  0.891773    0.96366276]
 [ 0.38344152  0.79172504  0.52889492]]
guess= [ 0.11491955 0.58894483 0.29613562]
actual= [ 1. 0. 0.]
guess= [ 0.02417515 0.68201728 0.29380758]
actual= [ 0. 1. 0.]
guess= [ 0.01047284 0.73688261 0.25264455]
actual= [ 0. 0. 1.]
error=127.57215588040505%
Running classifier....
b= [ 0.89518099 1.27909495 -0.60959868]
w= [[ 1.32442085  1.15762669 -0.61528129]
 [ 2.08761584  0.19311681 -0.66630056]
 [-1.45854048  0.69743801  3.05412545]
 [-0.50847225 -0.08447748  2.2970112 ]]
guess= [ 9.30802089e-01 6.91900641e-02 7.84694703e-06]
actual= [ 1. 0. 0.]
guess= [ 0.03034742 0.83405964 0.13559294]
actual= [ 0. 1. 0.]
guess= [ 0.0009035 0.26344645 0.73565006]
actual= [ 0. 0. 1.]
error=17.27467814880828%