ad-mnist

Exploring typed functional programming for machine learning via a simple neural network classifier for MNIST.

  • Dependent types enable static checking of matrix dimensions, making the implementation of e.g. backpropagation much less error-prone. Here I use Numeric.LinearAlgebra.Static from the hmatrix package (see the first sketch after this list).
  • Automatic differentiation avoids writing backpropagation code by hand. Even with dependent types, a manual implementation is still relatively error-prone and tedious, and it creates substantial friction when experimenting with different model architectures. Initially I'm experimenting with the ad package (see the second sketch after this list); another interesting option is backprop.
  • GPU acceleration (future goal): accelerate seems like an interesting choice here, though it does not yet appear to be usable with existing AD libraries.
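
To illustrate the first point, here is a minimal sketch of a dimension-checked fully connected layer using Numeric.LinearAlgebra.Static. The Layer type and runLayer function are hypothetical names for illustration, not code from this repository:

```haskell
{-# LANGUAGE DataKinds #-}

import GHC.TypeLits (KnownNat)
import Numeric.LinearAlgebra.Static

-- A fully connected layer from n inputs to m outputs (hypothetical
-- example type). The dimensions live in the types, so a mismatched
-- multiply is a compile-time error rather than a runtime crash.
data Layer n m = Layer { weights :: L m n, bias :: R m }

-- Apply the layer: (#>) only type-checks when the inner dimensions
-- agree, and the Num instance for R m gives elementwise bias addition.
runLayer :: (KnownNat n, KnownNat m) => Layer n m -> R n -> R m
runLayer (Layer w b) x = w #> x + b

-- E.g. a hidden layer for MNIST-sized inputs would be a Layer 784 30;
-- feeding it an R 100 instead of an R 784 is rejected at compile time.
```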
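And for the second point, a minimal sketch of what the ad package provides: write the loss as an ordinary polymorphic function and let grad derive the gradient via reverse-mode AD. The toy loss below is a made-up example, not the model in this repository:

```haskell
import Numeric.AD (grad)

-- A toy quadratic loss over two parameters, written as a plain
-- polymorphic function; no derivative code anywhere.
loss :: Num a => [a] -> a
loss [w1, w2] = (3 * w1 + w2 - 5) ^ 2
loss _        = error "expected exactly two parameters"

-- grad differentiates loss with reverse-mode AD.
main :: IO ()
main = print (grad loss [1, 1 :: Double])  -- prints [-6.0,-2.0]
```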

Inspired by some of Justin Le's blog posts on the subject.