# Neural Network from Scratch with NumPy

## Overview

This project builds a neural network from scratch using NumPy. The network is designed for classification tasks and includes functionality for forward propagation, backward propagation, training, prediction, and visualization.
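To make this concrete, here is a minimal sketch of one dense layer's forward and backward pass in NumPy. It illustrates the general technique only; the actual implementation lives in `neuro.py` and may be organized differently.

```python
import numpy as np

def forward(X, W, b):
    # Pre-activation Z = X @ W + b, followed by a sigmoid activation
    Z = X @ W + b
    A = 1.0 / (1.0 + np.exp(-Z))
    return Z, A

def backward(dA, Z, X, W):
    # Chain rule through the sigmoid: s'(z) = s(z) * (1 - s(z))
    s = 1.0 / (1.0 + np.exp(-Z))
    dZ = dA * s * (1.0 - s)
    dW = X.T @ dZ        # gradient w.r.t. the weights
    db = dZ.sum(axis=0)  # gradient w.r.t. the bias
    dX = dZ @ W.T        # gradient passed to the previous layer
    return dW, db, dX
```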

## Features

- **Customizable Architecture:** Define the neural network structure with any number of layers and nodes.
- **Activation Functions:** Includes ReLU and Sigmoid activation functions (see the sketch after this list).
- **Training:** Implements forward and backward propagation for training the network.
- **Visualization:** Visualize the neural network architecture using Matplotlib.
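For reference, here is a minimal NumPy sketch of the two activations and their derivatives as they are conventionally defined; the actual names and signatures in `neuro.py` may differ.

```python
import numpy as np

def relu(z):
    # ReLU: element-wise max(0, z)
    return np.maximum(0, z)

def relu_derivative(z):
    # 1 where z > 0, 0 elsewhere
    return (z > 0).astype(float)

def sigmoid(z):
    # Squashes each input into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(z):
    # d/dz sigmoid(z) = sigmoid(z) * (1 - sigmoid(z))
    s = sigmoid(z)
    return s * (1.0 - s)
```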

## Directory Structure

```
├── neuro.py    # Implementation of the NeuralNetwork class
├── image.py    # Basic script to visualize the neural network
├── image2.py   # Improved script to visualize the neural network with large nodes
├── test.py     # Basic script to test the neural network
├── test2.py    # Improved script with if __name__ == "__main__"
├── notes.txt   # Project notes
└── README.md   # Project README file
```

## Usage

### Testing the Neural Network

The `test.py` script demonstrates training the neural network on a simple XOR problem. For an improved version with an `if __name__ == "__main__"` guard, see `test2.py`.
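A hypothetical training run might look like the sketch below. The `NeuralNetwork` class name comes from the directory listing above, but the constructor and method names are assumptions; see `test.py` for the actual API.

```python
import numpy as np
from neuro import NeuralNetwork  # class name per this repo; API below is assumed

# XOR truth table: two binary inputs, one binary output
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])

# Hypothetical constructor and method names -- check test.py for the real calls
nn = NeuralNetwork(layers=[2, 4, 1])
nn.train(X, y, epochs=10000, learning_rate=0.1)
print(nn.predict(X))  # should approach [[0], [1], [1], [0]]
```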

### Visualizing the Neural Network

The `image.py` script visualizes the neural network architecture. For an improved version that handles large numbers of nodes in the hidden layers, see `image2.py`.
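As a rough illustration of the idea, the sketch below draws a layered architecture with plain Matplotlib: one column of nodes per layer, fully connected to the next column. `image.py` and `image2.py` may use a different approach.

```python
import matplotlib.pyplot as plt

def draw_network(layer_sizes):
    """Scatter one column of nodes per layer and connect adjacent layers."""
    fig, ax = plt.subplots()
    # Center each layer's nodes vertically around y = 0
    positions = []
    for x, n in enumerate(layer_sizes):
        positions.append([(x, i - (n - 1) / 2.0) for i in range(n)])
    # Draw edges between every pair of nodes in adjacent layers
    for left, right in zip(positions, positions[1:]):
        for x0, y0 in left:
            for x1, y1 in right:
                ax.plot([x0, x1], [y0, y1], color="gray", linewidth=0.5, zorder=1)
    # Draw the nodes on top of the edges
    for layer in positions:
        xs, ys = zip(*layer)
        ax.scatter(xs, ys, s=300, zorder=2)
    ax.axis("off")
    plt.show()

draw_network([2, 4, 4, 1])  # example architecture
```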

## Examples

*(Architecture visualization image)*

Outputs of `test.py` and `test2.py`: *(images `NN2646464642` and `NN2141414142`)*

The best result: *(image `NN2646464642CL`)*

## Project Notes

Detailed notes on the project's design, features, challenges, and solutions are documented in `notes.txt`.

## Future Enhancements

- **Additional Activation Functions:** Implement more activation functions such as Tanh, Leaky ReLU, etc.
- **Regularization Techniques:** Add regularization methods to prevent overfitting.
- **Advanced Optimizers:** Implement advanced optimization algorithms such as Adam and RMSprop.
- **Hyperparameter Tuning:** Provide a mechanism for automated hyperparameter tuning.

## Acknowledgements

Inspired by various online tutorials and documentation on neural networks and machine learning. Thanks to the open-source community for providing the tools and libraries that made this project possible.
