Simple-Spiking-Neural-Network-STDP

A simple from-scratch implementation of a Spiking Neural Network with STDP in Python, trained on MNIST.


A simple implementation of an SNN

Neurodynamics Project - Group 10

This project was created as part of the Neurodynamics lecture at the University of Osnabrück, Germany. It contains a simple from-scratch implementation of a Spiking Neural Network with STDP in Python.

(back to top)

📖 Table of Contents

  • ❓ Why?
  • ✨ Features
  • 💻 Usage
  • 💾 Structure
  • 🚫 Limitations
  • 📊 Poster
  • 📃 Paper
  • 📝 Authors
  • 📎 License

(back to top)

❓ Why?

Artificial Neural Networks (ANNs) are only loosely inspired by the human brain, while Spiking Neural Networks (SNNs) incorporate several of its concepts more directly. Spike-Timing-Dependent Plasticity (STDP) is one of the most commonly used biologically inspired unsupervised learning rules for SNNs.
To obtain a better understanding of SNNs, we compared their image-classification performance to fully connected ANNs on the MNIST dataset.
MNIST Example Images
For this to work, we had to transform the input data for the SNN into rate-encoded spike trains. As a major part of our work, we provide a comprehensible implementation of an STDP-based SNN.
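
For intuition, here is a minimal sketch of the textbook pair-based STDP update, in which a synapse is strengthened when the presynaptic neuron fires shortly before the postsynaptic one and weakened otherwise. The function and constants below are illustrative only; the rule and parameters actually used in this repository live in SNN.py and Parameters.py and may differ:

```python
import numpy as np

# Illustrative constants, not the values used in this repository.
A_PLUS, A_MINUS = 0.01, 0.012     # learning-rate amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # time constants in ms

def stdp_weight_change(t_pre, t_post):
    """Weight change for one pre/post spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt > 0:
        # Pre fires before post: potentiation, decaying with the time gap.
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    # Post fires before (or together with) pre: depression.
    return -A_MINUS * np.exp(dt / TAU_MINUS)
```

In practice, implementations usually track exponentially decaying spike traces per neuron rather than looping over explicit spike pairs, which yields the same kind of update more efficiently.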

(back to top)

✨ Features

With the provided files, you can either train your own Spiking Neural Network or run inference with existing pretrained weights. For training, you can either use the dataset we uploaded in the MNIST folder and its subfolders, or simply use the MNIST dataset provided by tensorflow. Accordingly, the SNN.py file contains examples for both: how to convert your own image data into spike trains, and how to transform an existing tensorflow dataset into spike trains.
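
To illustrate the idea behind that conversion, here is a minimal, hypothetical rate-encoding sketch; the function rate_encode and its constants are ours, not the repository's, and the actual implementation in SNN.py may differ:

```python
import numpy as np
import tensorflow as tf

def rate_encode(image, n_timesteps=100, max_rate=0.5, rng=None):
    """Convert a greyscale image into Poisson spike trains: the brighter
    a pixel, the more likely it is to spike at each time step."""
    rng = rng or np.random.default_rng()
    probs = image.astype(np.float32).ravel() / 255.0 * max_rate
    # Spike matrix: one row per time step, one column per pixel.
    return (rng.random((n_timesteps, probs.size)) < probs).astype(np.uint8)

# Loading MNIST through tensorflow instead of the bundled PNG folders:
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
spike_train = rate_encode(x_train[0])  # shape: (100, 784)
```

Each pixel's intensity is mapped to a per-timestep spike probability, so brighter pixels produce denser spike trains.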

(back to top)

💻 Usage

To use our code, you first have to install the required libraries from the requirements.txt.

 pip install -r requirements.txt

After this, you can train your own SNN.

 python3 main.py -mode training -use_tf_dataset

You can also use this script to test your own trained network and weights.

 python3 main.py -mode inference -weights_path folder/weights.csv -labels_path folder/labels.csv -image_inference_path folder/picture.png

To get a list of all possible hyperparameters, use

 python3 main.py -h

(back to top)

💾 Structure

.
├── src
│   ├── MNIST                              # The entire MNIST dataset
│   │   ├── testing
│   │   │   ├── 0                          # Each subfolder represents a class
│   │   │   │   ├── 3.png
│   │   │   │   ├── 10.png
│   │   │   │   ├── 13.png
│   │   │   │   ...
│   │   │   ├── 1
│   │   │   ├── 2
│   │   │   ├── 3
│   │   │   ├── 4
│   │   │   ├── 5
│   │   │   ├── 6
│   │   │   ├── 7
│   │   │   ├── 8
│   │   │   └── 9
│   │   ├── training
│   │   │   ├── 0
│   │   │   ...
│   │   └── labels.csv
├── Notebooks
│   ├── ANN_Comparison.ipynb          # Comparison to ANNs trained in Tensorflow
│   ├── Visualization_Helper.ipynb    # Visualization of our results
│   └── Deprecated_Training.ipynb     # Old, deprecated training notebook
├── Pretrained                        # Pretrained weights and labels for testing
│   ├── labels.csv
│   └── weights.csv
├── .gitignore
├── main.py                 # Main entry point for training/inference of the SNN
├── Neuron.py
├── Paper.pdf               # The term paper we submitted
├── Parameters.py           # All parameters used for training/inference
├── README.md
├── requirements.txt
└── SNN.py                  # All functions for training/inference

(back to top)

🚫 Limitations

  • No hidden layers implemented
  • Conversion into spike trains works only for greyscale images
  • Long training times
  • Only a subset of the MNIST dataset was used for training

(back to top)

📊 Poster

As part of this lecture, we also presented a poster on our results to our fellow students and lecturers.

Group poster


(back to top)

📃 Paper

If you are interested in the exact hyperparameters we used, or want more details in general, take a look at the accompanying term paper we wrote for this lecture, which is also included in this repository. In short, here is what we found:
In general, our results showed that our implementation of a Spiking Neural Network reached good classification performance after only one epoch of training, but improved little beyond that point and was handily beaten, after a few training epochs, by a classical ANN of similar size using Dense layers. Furthermore, the SNN did not benefit from additional neurons as much as the classical ANNs with Dense layers did.

(back to top)

πŸ“ Authors

Peter Keffer
Leonie Grafweg
Paula Heupel
Cornelius Wolff

(back to top)

📎 License

Copyright 2022 Cornelius Wolff, Paula Heupel, Leonie Grafweg, Peter Keffer

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

(back to top)