Documentation at https://lawjarp-a.github.io/my-autograd/
pip install my_autograd
Use the Value class to define the scalar values you will use to implement backpropagation, then call the backward() method on the output to compute the gradients, as shown in the example below.
# Assumption: Value and draw_dag are importable from the package top level
from my_autograd import Value, draw_dag

# Define a linear layer: y = relu(w*x + b)
w = Value(3.0, label='w')
b = Value(2.0, label='b')
x = Value(1.0, label='x')
z = (w * x) + b; z.label = 'z'
y = z.relu(); y.label = 'y'
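# (Optional check) Assuming Value exposes its number via a `.data` attribute
# (a micrograd-style convention, not confirmed by this README), the forward
# pass gives z = 3.0*1.0 + 2.0 = 5.0 and y = relu(5.0) = 5.0.
print(y.data)  # expected: 5.0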
# Visualize the graph
draw_dag(y)
# Call the backward method to compute gradients
y.backward()
# Visualize the graph again, now with the gradients populated
draw_dag(y)
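Once backward() has run, the gradient of y with respect to each input can be read back. A minimal sketch, assuming each Value stores its gradient in a `.grad` attribute (a micrograd-style convention, not confirmed by this README):

# With w=3.0, x=1.0, b=2.0: z = 5.0 > 0, so the ReLU passes the gradient through (dy/dz = 1)
print(w.grad)  # dy/dw = dy/dz * x = 1.0
print(x.grad)  # dy/dx = dy/dz * w = 3.0
print(b.grad)  # dy/db = dy/dz * 1 = 1.0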