
🤖 Neural Network From Scratch: anndy

A painfully simple autograd engine built in vanilla Python: no PyTorch, no NumPy, just math.


🐍 Try It Out

The neural network utility classes and functions live in anndy.py.

  • Dependencies: none.

For a simple demo on a regression task, run demo.py.

  • Dependencies: matplotlib, tqdm.

🧠 Example Usage

Initialization:

nn = anndy.MLP((8, "tanh"), (4, "tanh"), (2, "relu"), (1, "relu"))  # Initialize multi-layer perceptron
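Each tuple presumably specifies one layer as (neuron count, activation name), so the network above ends in a single ReLU output unit; check anndy.MLP's signature for the exact meaning.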

Optimization loop:

preds = [nn(x) for x in data_x]  # Forward pass
loss = anndy.mean_squared_error(data_y, preds)  # Compute loss

nn.zero_grad()  # Reset gradients
loss.backward()  # Backward pass
nn.nudge(0.001)  # Descend gradient
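
Putting the snippets together, a complete training run might look something like the sketch below. The names data_x and data_y, the toy data, and the epoch count are illustrative, and the input width is assumed to match whatever anndy.MLP expects; the rest follows the calls shown above.

import anndy

# Illustrative toy regression data: parallel lists of input samples and targets.
data_x = [[0.0, 0.5], [0.1, 0.9], [0.7, 0.2]]  # input width assumed to fit the MLP
data_y = [0.0, 0.09, 0.14]

nn = anndy.MLP((8, "tanh"), (4, "tanh"), (2, "relu"), (1, "relu"))

learning_rate = 0.001
for epoch in range(1000):
    preds = [nn(x) for x in data_x]  # Forward pass over the dataset
    loss = anndy.mean_squared_error(data_y, preds)  # Compute loss

    nn.zero_grad()  # Reset gradients
    loss.backward()  # Backward pass
    nn.nudge(learning_rate)  # Gradient descent step

    if epoch % 100 == 0:
        print(f"epoch {epoch}: loss = {loss}")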

🔗 Links & Sources
