
Implementing backpropagation, as used in PyTorch modules, from scratch in Python. Derived from Andrej Karpathy's micrograd.


my-autograd

Documentation at https://lawjarp-a.github.io/my-autograd/

Install

pip install my_autograd
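
A quick sanity check that the install worked (the import name is assumed here to match the pip package name):

python -c "from my_autograd import Value"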

How to use

Wrap your scalars in the Value class and build expressions from them; this records the computation graph needed for backpropagation. Call the backward() method on the output to compute the gradients.

from my_autograd import Value, draw_dag  # import path assumed to match the package name

# Define a linear unit: z = w*x + b
w = Value(3.0, label='w')
b = Value(2.0, label='b')
x = Value(1.0, label='x')

z = (w * x) + b; z.label = 'z'

y = z.relu(); y.label = 'y'
# Visualize the graph
draw_dag(y)

# Call the backward method to compute gradients
y.backward()
# Visualize the graph again
draw_dag(y)
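
After backward() runs, every Value in the graph holds its gradient. Assuming micrograd-style .data and .grad attributes (this library is derived from micrograd, so the exact names may differ), the results can be checked by hand with the chain rule: z = w*x + b = 3.0*1.0 + 2.0 = 5.0, and since z > 0 the ReLU acts as the identity, so dy/dz = 1.

# Inspect the computed gradients (.grad attribute assumed, as in micrograd)
print(y.data)  # 5.0  relu(3.0*1.0 + 2.0)
print(w.grad)  # 1.0  dy/dw = dy/dz * dz/dw = 1 * x
print(b.grad)  # 1.0  dy/db = dy/dz * dz/db = 1 * 1
print(x.grad)  # 3.0  dy/dx = dy/dz * dz/dx = 1 * w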
