Implementation of an L-Layer Neural Network

This repository provides an educational implementation of a fully connected neural network (FNN) classifier using only NumPy. The implementation is highly customizable, allowing users to specify the number and size of hidden layers, the activation function (ReLU, sigmoid, tanh), and the optimization technique (Gradient Descent, Momentum, Adam). It also supports L2 regularization and mini-batch training.
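
As a rough illustration of the kind of configurable L-layer forward pass described above, the sketch below builds a network from a list of layer sizes and a chosen hidden activation using only NumPy. It is a hypothetical example, not the repository's actual API; all function and parameter names here are illustrative.

```python
import numpy as np

# Hypothetical sketch (not this repository's API): a minimal L-layer forward
# pass with a configurable list of layer sizes and a configurable activation.

ACTIVATIONS = {
    "relu": lambda z: np.maximum(0.0, z),
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),
    "tanh": np.tanh,
}

def init_params(layer_sizes, seed=0):
    """He-style initialization for each weight matrix; biases start at zero."""
    rng = np.random.default_rng(seed)
    params = []
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_out, n_in))
        b = np.zeros((n_out, 1))
        params.append((W, b))
    return params

def forward(X, params, hidden_activation="relu"):
    """Propagate a batch X of shape (n_features, m) through all layers.

    Hidden layers use the chosen activation; the output layer uses a sigmoid,
    as is typical for a binary classifier.
    """
    act = ACTIVATIONS[hidden_activation]
    A = X
    for W, b in params[:-1]:
        A = act(W @ A + b)
    W_out, b_out = params[-1]
    return ACTIVATIONS["sigmoid"](W_out @ A + b_out)

# Example: 2 input features, two hidden layers (16 and 8 units), 1 output unit.
if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(2, 5))  # 5 examples
    params = init_params([2, 16, 8, 1])
    print(forward(X, params, hidden_activation="tanh").shape)  # (1, 5)
```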