AutoCPP

A lightweight reverse-mode automatic differentiation (autograd) engine implemented in C++ with Python bindings.

Features

  • Core Tensor Operations: Addition, subtraction, multiplication, division, power
  • Mathematical Functions: sin, cos, exp, log, tanh
  • Activation Functions: ReLU, Sigmoid
  • Automatic Differentiation: Reverse-mode autodiff with gradient accumulation (see the sketch after this list)
  • Python Integration: Seamless Python API via Pybind11
  • Operator Overloading: Natural mathematical syntax in Python
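Gradient accumulation means that when a tensor feeds into the graph in more than one place, the partial derivatives from each path are summed into its grad. A minimal sketch, assuming only the Tensor API shown under Usage below:

from autocpp import Tensor

# x appears twice in the expression, so its gradients accumulate
x = Tensor(2.0, requires_grad=True)
z = x * x + x   # dz/dx = 2x + 1
z.backward()
print(x.grad)   # expected: 5.0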

Installation

Prerequisites

  • C++14-compatible compiler
  • Python 3.6+
  • CMake 3.12+
  • pybind11

Building from Source

# Install pybind11
pip install pybind11

# Build and install the package
pip install .

# Or for development
pip install -e .
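After installing, a quick smoke test confirms the bindings load (a sketch; it uses only the Tensor attributes documented in the API reference below):

from autocpp import Tensor

t = Tensor(1.0, requires_grad=True)
print(t.value, t.requires_grad)   # 1.0 True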

Running Tests

# C++ tests
mkdir build && cd build
cmake ..
make
./test_autocpp

# Python tests
python tests/test_python.py

# Or with pytest
pip install pytest
pytest tests/test_python.py

Usage

from autocpp import Tensor

# Create tensors with gradient tracking
x = Tensor(2.0, requires_grad=True)
y = Tensor(3.0, requires_grad=True)

# Forward pass
z = x * y + x.sin()

# Backward pass
z.backward()

# Access gradients
print(f"dz/dx = {x.grad}")  # Gradient with respect to x
print(f"dz/dy = {y.grad}")  # Gradient with respect to y

Neural Network Example

from autocpp import Tensor

# Simple neural network layer
x = Tensor(1.0, requires_grad=True)
w1 = Tensor(0.5, requires_grad=True)
b1 = Tensor(0.1, requires_grad=True)
w2 = Tensor(-0.3, requires_grad=True)
b2 = Tensor(0.2, requires_grad=True)

# Forward pass
h = (x * w1 + b1).tanh()  # Hidden layer with tanh activation
y = h * w2 + b2            # Output layer

# Compute gradients
y.backward()

# All parameters now have gradients
print(f"dL/dw1 = {w1.grad}")
print(f"dL/db1 = {b1.grad}")
print(f"dL/dw2 = {w2.grad}")
print(f"dL/db2 = {b2.grad}")

API Reference

Tensor Class

Tensor(value: float, requires_grad: bool = False)

Attributes:

  • value: The scalar value
  • grad: Accumulated gradient
  • requires_grad: Whether to track gradients

Methods:

  • backward(): Compute gradients via backpropagation
  • zero_grad(): Reset gradients to zero

Operations:

  • Arithmetic: +, -, *, /, **, - (negation)
  • Trigonometric: sin(), cos()
  • Exponential: exp(), log()
  • Activation: tanh(), relu(), sigmoid()
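A short sketch exercising a couple of the listed operations; the expected gradient follows from the analytic derivative:

import math
from autocpp import Tensor

x = Tensor(0.5, requires_grad=True)

# y = exp(x) * sin(x), so dy/dx = exp(x) * (sin(x) + cos(x))
y = x.exp() * x.sin()
y.backward()

expected = math.exp(0.5) * (math.sin(0.5) + math.cos(0.5))
print(x.grad, expected)   # the two values should agree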

Engine Class

Engine.backward(tensor)    # Run backpropagation from the given tensor
Engine.zero_grad(tensors)  # Zero the gradients of a list of tensors
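A brief sketch of driving backpropagation through the Engine directly rather than the Tensor convenience methods; importing Engine from the top-level package is an assumption, as the README does not show it:

from autocpp import Tensor, Engine  # Engine import is assumed

x = Tensor(1.0, requires_grad=True)
y = Tensor(2.0, requires_grad=True)
z = x * y

Engine.backward(z)        # equivalent to z.backward()
print(x.grad, y.grad)     # 2.0 1.0

Engine.zero_grad([x, y])  # reset gradients for a list of tensors
print(x.grad, y.grad)     # 0.0 0.0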
