This is the code for the paper "Unsupervised learning of digit recognition using spike-timing-dependent plasticity" available at http://journal.frontiersin.org/article/10.3389/fncom.2015.00099/abstract#

To run the simulations you also need to download the MNIST dataset from http://yann.lecun.com/exdb/mnist/ and install the Brian simulator (the easiest way is to run "easy_install brian" from a shell; otherwise see http://briansimulator.org/docs/installation.html).
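For reference, the raw MNIST files use the IDX format. Below is a minimal sketch of how they can be read in Python; the folder and file names are assumptions, and the main script already contains its own loading routine.

    # Minimal sketch of reading the raw MNIST IDX files, assuming they
    # have been unpacked into a local "mnist/" folder with the standard
    # file names from yann.lecun.com.
    import struct
    import numpy as np

    def load_mnist_images(path):
        with open(path, 'rb') as f:
            magic, n, rows, cols = struct.unpack('>IIII', f.read(16))
            images = np.frombuffer(f.read(), dtype=np.uint8)
        return images.reshape(n, rows, cols)

    def load_mnist_labels(path):
        with open(path, 'rb') as f:
            magic, n = struct.unpack('>II', f.read(8))
            labels = np.frombuffer(f.read(), dtype=np.uint8)
        return labels

    training_images = load_mnist_images('mnist/train-images-idx3-ubyte')
    training_labels = load_mnist_labels('mnist/train-labels-idx1-ubyte')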

Testing with pretrained weights:
First run the main file "Diehl&Cook_spiking_MNIST.py" (which by default uses the pretrained weights) and wait until the simulation has finished (on recent hardware this should take about 30 minutes). Afterwards, run "Diehl&Cook_MNIST_evaluation.py" to evaluate the result, which should show a performance of 91.56%.
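If you want to inspect the stored weights directly, they are plain numpy arrays. The snippet below is only a sketch; the exact file name is an assumption, so check the "weights" subfolder for the files actually shipped or produced.

    import numpy as np

    # The file name "weights/XeAe.npy" is an assumption; look in the
    # "weights" subfolder for the names the scripts actually use.
    w = np.load('weights/XeAe.npy')
    print('array shape:', w.shape)
    print('first entries:', w[:5])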

Training a new network:
First, modify the main file "Diehl&Cook_spiking_MNIST.py" by changing line 210 to "test_mode = False" to train the network. The resulting weights will be stored in the subfolder "weights". Those weights can then be used to test the performance of the network by running "Diehl&Cook_spiking_MNIST.py" again with line 210 changed back to "test_mode = True". After the simulation has finished, the performance can be evaluated using "Diehl&Cook_MNIST_evaluation.py" and should be around 89% with the given parameters.
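Conceptually, the switch on line 210 is a single boolean. The sketch below illustrates how such a flag is typically used; apart from test_mode itself, the paths and variable names are illustrative assumptions, not a copy of the original code.

    # Illustrative sketch of the test_mode switch (line 210 in the main
    # script); the surrounding logic is an assumed simplification.
    test_mode = False          # False: train the network and save new weights
                               # True: load stored weights and only test

    if test_mode:
        weight_path = 'weights/'   # use previously learned weights
        do_plasticity = False      # STDP switched off during testing
    else:
        weight_path = 'random/'    # assumed folder with random initial weights
        do_plasticity = True       # STDP active; learned weights end up
                                   # in the "weights" subfolder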


If you have any questions, don't hesitate to contact me at peter.u.diehl@gmail.com.



Note:
In this simple demo the performance is evaluated using neuron assignments determined on the test set. This leads to a slight increase in performance but violates good machine-learning practice. The results presented in the paper did NOT use the test set to determine neuron assignments; instead, the assignments were generated by running the same script in testing mode but using the 60000 examples of the training set to determine the neuron assignments (this can be done by changing line 85 in "Diehl&Cook_MNIST_evaluation.py" to training_ending = '60000').
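The underlying evaluation idea is: assign each excitatory neuron to the digit class for which it responded most strongly during the assignment phase, then classify each example by the class whose assigned neurons spiked most. Below is a minimal sketch of that logic, assuming spike counts are available as an (examples x neurons) array; it is not the original evaluation script.

    # Minimal sketch of assignment-based classification, assuming
    # spike_counts is an (n_examples, n_neurons) array of spike counts
    # and labels holds the corresponding digit label per example.
    import numpy as np

    def assign_neurons(spike_counts, labels, n_classes=10):
        # Assign each neuron to the class with the highest average response.
        n_neurons = spike_counts.shape[1]
        mean_response = np.zeros((n_classes, n_neurons))
        for c in range(n_classes):
            mean_response[c] = spike_counts[labels == c].mean(axis=0)
        return mean_response.argmax(axis=0)

    def classify(spike_counts, assignments, n_classes=10):
        # Classify each example by the class whose assigned neurons
        # show the highest average spike count.
        rates = np.zeros((spike_counts.shape[0], n_classes))
        for c in range(n_classes):
            mask = assignments == c
            if mask.any():
                rates[:, c] = spike_counts[:, mask].mean(axis=1)
        return rates.argmax(axis=1)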