- Simple and modular neural network library written in Python, using just NumPy.
- Keras-like network initialization.
- Easy to understand and build upon.
- Activation Functions
  - Linear
  - Sigmoid
  - ReLU
  - Softmax (sketched below)
- Optimizer
  - Adam
- Loss Functions
  - Cross Entropy (sketched below)
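For intuition, here is a minimal NumPy sketch of what a softmax activation and a cross-entropy loss compute; the library's actual implementations may differ in signature and numerical details.

```python
import numpy as np

def softmax(X):
    # Subtract the row-wise max for numerical stability before exponentiating
    e = np.exp(X - np.max(X, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

def cross_entropy(y_pred, y_true, eps=1e-12):
    # Mean negative log-likelihood against one-hot targets
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=-1))
```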
Example usage:

```python
import tinyNN as tnn

nn = tnn.NeuralNetwork()
nn.addLayer(2)                          # Input layer (2 inputs)
nn.addLayer(6, tnn.activation_sigmoid)  # Hidden dense layer
nn.addLayer(6, tnn.activation_sigmoid)  # Hidden dense layer
nn.addLayer(6, tnn.activation_sigmoid)  # Hidden dense layer
nn.addLayer(3, tnn.activation_softmax)  # Output layer
nn.compile(lr=1)

# To train
nn.fit(Xs, Ys, epochs=5)                # Train for 5 epochs
```
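For concreteness, Xs and Ys might be built as below; the exact shapes fit expects are an assumption here, not confirmed by the library.

```python
import numpy as np

# Hypothetical toy dataset: 4 samples with 2 features each,
# and one-hot targets over 3 classes (shapes are assumptions)
Xs = np.array([[0.0, 0.0],
               [0.0, 1.0],
               [1.0, 0.0],
               [1.0, 1.0]])
Ys = np.array([[1, 0, 0],
               [0, 1, 0],
               [0, 1, 0],
               [0, 0, 1]])

nn.fit(Xs, Ys, epochs=5)
```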
Prerequisites:

- Python 3.6+
- NumPy

Installation:

- Clone this repo to your local machine using

```bash
git clone https://github.com/SuyashMore/tinyNeuralNet
```
How it works:

- Weights and biases are stored as NumPy matrices, so a layer's forward pass is just a matrix product plus a bias (see the sketch below).
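A minimal sketch of that idea; the variable names and shapes here are illustrative assumptions, not the library's internals.

```python
import numpy as np

# Illustrative dense layer with 2 inputs and 6 units
rng = np.random.default_rng(0)
W = rng.standard_normal((6, 2))  # weight matrix (units x inputs)
b = np.zeros((6, 1))             # bias vector

x = np.array([[0.5], [-1.0]])    # one input sample as a column vector
z = W @ x + b                    # pre-activation
a = 1 / (1 + np.exp(-z))         # sigmoid activation of the layer output
```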
For example, the sigmoid activation is implemented as:

```python
import numpy as np

def activation_sigmoid(X, der=False):
    if not der:
        # Standard logistic function: 1 / (1 + e^(-X))
        return np.divide(1, 1 + np.exp(-X))
    else:
        # Derivative of the activation; X is assumed to already be the
        # sigmoid output, so sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
        return np.multiply(X, (1 - X))
```
- The der flag selects the derivative of the activation function, which is used during backpropagation (a matching ReLU sketch follows).
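Following the same der-flag convention, a ReLU activation could look like this; the function below is a hypothetical sketch, not the library's verified source.

```python
import numpy as np

def activation_relu(X, der=False):
    # Hypothetical sketch following the library's der-flag convention
    if not der:
        return np.maximum(0, X)
    else:
        # ReLU output is positive exactly where its input was positive,
        # so the derivative is 1 where X > 0 and 0 elsewhere
        return (X > 0).astype(float)
```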
License:

- MIT License
- Copyright 2020 © SuyashMore