A small auto-diffing neural network example written in Zig, heavily inspired by micrograd. The original scalar-valued version can be found under the `scalar` tag; the version on `main` is vector-valued, using a mini-NumPy implemented in `src/ndarray.zig`.
It's a toy, written just for learning purposes!
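For readers unfamiliar with the micrograd approach, here is a minimal scalar-valued reverse-mode autodiff sketch in Python (the language micrograd itself is written in), for illustration only. The Zig code follows the same idea, generalized from scalars to arrays via `src/ndarray.zig`:

```python
# Minimal micrograd-style scalar autodiff sketch -- illustration only,
# not the actual API of this repository.

class Value:
    """A scalar that records how it was computed, so gradients can
    flow backwards through the expression graph."""

    def __init__(self, data, children=()):
        self.data = data
        self.grad = 0.0
        self._children = children
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    visit(c)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x = Value(3.0)
y = Value(4.0)
z = x * y + x   # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 5.0 3.0
```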
The `src/main.zig` file implements an MLP model used to classify hand-drawn digits. It achieves roughly 96% accuracy after training.
A reference implementation of essentially the same model can be found in `pytorch/mnist.py`.
- Initial setup: run `python download_mnist.py` to download the MNIST dataset.
- Start training: `zig build run -Doptimize=ReleaseFast`
Most of the time you want to run in ReleaseFast mode, since the default debug build is much slower.