micrograd


Backpropagation

Implements backpropagation (reverse-mode autodiff) over a dynamically built DAG, plus a small neural-networks library on top of it with a PyTorch-like API. The DAG operates only over scalar values, so e.g. each neuron is chopped up into all of its individual tiny adds and multiplies.
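As a rough illustration of the idea (a minimal sketch, not necessarily the repo's exact API), a scalar `Value` node can record its inputs as it is created, building the DAG dynamically; calling `backward()` then topologically sorts the graph and applies the chain rule in reverse:

```python
class Value:
    """A scalar node in a dynamically built computation DAG (sketch)."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # how to propagate grad to children
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad   # d(out)/d(self) = 1
            other.grad += out.grad  # d(out)/d(other) = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(out)/d(self) = other
            other.grad += self.data * out.grad  # d(out)/d(other) = self
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the DAG, then run the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = a * b + a          # c = a*b + a
c.backward()
print(a.grad, b.grad)  # dc/da = b + 1 = 4.0, dc/db = a = 2.0
```

Because every neuron's computation is expressed in these scalar adds and multiplies, backpropagation through a whole network falls out of the same two-operation chain rule.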