Neural network experiments written purely in numpy
- Learning backprop with the MNIST classification task
- A cleaner version of the backprop notebook above: it lacks the markdown explanations, but is free of the bug in the original
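The backprop notebooks train a classifier with hand-derived gradients in numpy. A minimal sketch of that idea, using a synthetic stand-in for MNIST (the data, network sizes, and learning rate here are illustrative assumptions, not the repo's actual settings):

```python
import numpy as np

# Two-layer network trained by hand-written backprop in pure numpy.
# Toy data stands in for MNIST; shapes and hyperparameters are assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 20))              # 64 samples, 20 features
y = (X[:, 0] > 0).astype(int)              # toy binary labels
Y = np.eye(2)[y]                           # one-hot targets

W1 = rng.normal(scale=0.1, size=(20, 16))
W2 = rng.normal(scale=0.1, size=(16, 2))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

for step in range(200):
    # forward pass
    h = np.maximum(0, X @ W1)              # ReLU hidden layer
    p = softmax(h @ W2)                    # class probabilities
    # backward pass: gradient of cross-entropy w.r.t. logits is (p - Y)
    d_logits = (p - Y) / len(X)
    dW2 = h.T @ d_logits
    d_h = d_logits @ W2.T * (h > 0)        # backprop through the ReLU
    dW1 = X.T @ d_h
    W1 -= 0.5 * dW1
    W2 -= 0.5 * dW2

acc = (p.argmax(axis=1) == y).mean()
```

On this linearly separable toy task the network should fit the labels almost perfectly within the 200 steps.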
- Synthetic gradients with the MNIST classification task
- Jupyter notebook
- Also check out the minimalist 145-line Gist for this project
- Inspired by this Google DeepMind paper
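The core idea of synthetic gradients (from the DeepMind paper referenced above) is that a small side module predicts the gradient of the loss with respect to a layer's activations, so that layer can update without waiting for the full backward pass. A minimal sketch with a linear predictor; all sizes, learning rates, and the squared-error loss are assumptions for illustration:

```python
import numpy as np

# Decoupled training of layer 1 via a synthetic-gradient module M:
# M maps activations h to a predicted dL/dh, and is itself trained
# to regress the true gradient once it becomes available.
rng = np.random.default_rng(1)
X = rng.normal(size=(32, 10))
Y = rng.normal(size=(32, 4))

W1 = rng.normal(scale=0.1, size=(10, 8))   # layer 1
W2 = rng.normal(scale=0.1, size=(8, 4))    # layer 2
M = np.zeros((8, 8))                       # linear synthetic-gradient module

loss0 = ((X @ W1 @ W2 - Y) ** 2).mean()    # loss before training

for step in range(300):
    h = X @ W1                             # layer-1 activations
    g_hat = h @ M                          # predicted dL/dh
    W1 -= 0.01 * (X.T @ g_hat)             # layer 1 updates from the prediction alone

    out = h @ W2                           # layer 2 with squared-error loss
    d_out = (out - Y) / len(X)
    g_true = d_out @ W2.T                  # true dL/dh, available only after layer 2
    W2 -= 0.1 * (h.T @ d_out)

    # train M to regress the true gradient
    M -= 0.1 * (h.T @ (g_hat - g_true)) / len(X)

loss = ((X @ W1 @ W2 - Y) ** 2).mean()
```

Because M starts at zero, layer 1 is initially frozen while M learns what the true gradients look like; the loss should still end below its starting value.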
- Hebbian learning (for a Dartmouth class)
- Jupyter notebook
- For Human Memory (PSYC 051.09) taught by Jeremy Manning
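Hebbian learning strengthens a weight in proportion to the product of pre- and post-synaptic activity. A minimal sketch using Oja's rule, a standard stabilized variant of the Hebb update whose weight vector converges to the data's first principal component (the synthetic data and learning rate are assumptions, not taken from the notebook):

```python
import numpy as np

# Oja's rule: delta_w = eta * y * (x - y * w), i.e. the Hebb term y*x
# plus a decay that keeps |w| bounded and drives w toward the top PC.
rng = np.random.default_rng(2)
# correlated 2-D data whose principal axis is roughly along (1, 0.9)
x1 = rng.normal(size=500)
X = np.stack([x1, 0.9 * x1 + 0.1 * rng.normal(size=500)], axis=1)

w = rng.normal(size=2)
for epoch in range(10):
    for x in X:
        y = w @ x                      # postsynaptic activity
        w += 0.01 * y * (x - y * w)    # Hebbian growth + Oja decay
```

After training, w should be close to a unit vector aligned (up to sign) with the data's principal direction.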
- "U loss" learning
- I test an ansatz for layer-wise training of neural networks. It didn't work. That's how research goes.
- Folder is here