SimpleBackprop
Inspired by this post, I decided to write a simple backprop algorithm in Julia.
The two files show a simple two-layer and three-layer ANN for solving a simple classification problem. Sigmoid activation units are used with error-weighted derivatives.
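The core idea can be sketched as follows. This is a minimal, hypothetical illustration of a two-layer sigmoid network trained with error-weighted derivatives on a toy dataset; names, sizes, and the data are my own assumptions, not the repo's actual code.

```julia
using Random
Random.seed!(1)  # reproducible weight initialization

# Sigmoid activation, applied element-wise
sigmoid(x) = 1.0 ./ (1.0 .+ exp.(-x))

# Toy XOR-style data: 4 samples, 3 input features (assumed for illustration)
X = [0.0 0.0 1.0;
     0.0 1.0 1.0;
     1.0 0.0 1.0;
     1.0 1.0 1.0]
y = [0.0; 1.0; 1.0; 0.0]

# Random weights in [-1, 1]: input -> 4 hidden units -> 1 output
W1 = 2 .* rand(3, 4) .- 1
W2 = 2 .* rand(4, 1) .- 1

for epoch in 1:10_000
    # forward pass
    a1 = sigmoid(X * W1)
    a2 = sigmoid(a1 * W2)

    # backward pass: error weighted by the sigmoid derivative a .* (1 .- a)
    d2 = (y .- a2) .* a2 .* (1 .- a2)
    d1 = (d2 * W2') .* a1 .* (1 .- a1)

    # gradient step (learning rate of 1 assumed)
    global W2 += a1' * d2
    global W1 += X' * d1
end
```

The three-layer variant just adds one more weight matrix and repeats the same delta propagation one level deeper.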
UPDATE:
As of 9/25/2015, dropout and gradient descent have been implemented.
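Dropout can be sketched in a few lines. This is a hypothetical inverted-dropout helper, not the repo's actual implementation: each hidden unit is zeroed with probability `p` and the survivors are rescaled so the expected activation is unchanged.

```julia
# Inverted dropout on an activation matrix `a` (hypothetical sketch)
function dropout(a, p)
    mask = rand(size(a)...) .> p       # keep each unit with probability 1 - p
    return (a .* mask) ./ (1 - p)      # rescale so the expected value matches
end
```

At test time no mask is applied; because of the rescaling, the forward pass needs no further adjustment.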
Features I'm looking to add:
- Parameterized layer sizes
- Momentum
- Mini-batches