
Bounded Homeostasis in Binarized Spiking Neural Networks

Requirements

A working Python (≥3.6) interpreter and the pip package manager are required. All dependencies can be installed using pip install -r requirements.txt. To avoid potential package conflicts, the use of a conda environment is recommended. The following commands create and activate a separate conda environment, clone this repository, and install all dependencies:

conda create -n snn-tha python=3.8
conda activate snn-tha
git clone https://github.com/jeshraghian/snn-tha.git
cd snn-tha
pip install -r requirements.txt

Code Execution

To execute code, cd into one of the four dataset directories and run python run.py.
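For example (the directory name below is illustrative; substitute one of the four dataset directories in this repo):

cd mnist        # illustrative name; use one of the repo's dataset directories
python run.py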

Hyperparameter Tuning

  • In each directory, conf.py defines all configuration parameters and hyperparameters for each dataset. The default parameters in this repo are identical to those for the binarized case with bounded homeostasis as reported in the corresponding paper.
  • To run binarized networks, set "binarize": True in conf.py; a sketch of such a configuration is shown below. For optimized parameters, follow the values reported in the paper.
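The snippet below sketches what a configuration entry might look like. Only the "binarize" flag is taken from this README; every other key and value is a hypothetical placeholder, not the repo's actual conf.py.

# Sketch of a conf.py-style configuration dictionary.
# Only "binarize" is documented above; the remaining keys
# and values are illustrative placeholders.
conf = {
    "binarize": True,    # train with binarized weights
    "num_epochs": 100,   # hypothetical training length
    "lr": 1e-3,          # hypothetical learning rate
}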

Temporal Coding

Section 4 of the paper demonstrates the use of bounded homeostasis (using threshold annealing as the warm-up technique) in a spike-timing task. A fully connected network of structure 100-1000-1 is used, where a Poisson spike train is passed at the input and the output neuron is trained to spike at a target time step by linearly ramping up its membrane potential over time, using a mean square error loss at each time step.
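The sketch below illustrates the flavor of this setup in plain PyTorch. It is not the repo's implementation: the hyperparameters (beta, threshold, target spike time, input rate) are assumptions, and a surrogate gradient would be needed to train through the spiking nonlinearity in practice.

# Minimal sketch of the spike-timing task; illustrative only.
import torch
import torch.nn as nn

num_steps, beta, threshold = 100, 0.9, 1.0   # assumed values
target_step = 50                             # assumed target spike time

fc1 = nn.Linear(100, 1000)   # 100-1000-1 architecture
fc2 = nn.Linear(1000, 1)

# Poisson spike train at the input (rate 0.3 is an assumption).
x = (torch.rand(num_steps, 100) < 0.3).float()

# Target: the output membrane potential ramps linearly, crossing
# the threshold at the target time step.
ramp = threshold * torch.arange(1, num_steps + 1) / target_step

mem1, mem2 = torch.zeros(1000), torch.zeros(1)
loss_fn, loss = nn.MSELoss(), 0.0
for t in range(num_steps):
    mem1 = beta * mem1 + fc1(x[t])          # leaky integration
    spk1 = (mem1 > threshold).float()       # hidden-layer spikes
    mem1 = mem1 - spk1 * threshold          # soft reset on spike
    mem2 = beta * mem2 + fc2(spk1)          # output membrane potential
    loss = loss + loss_fn(mem2, ramp[t].unsqueeze(0))
loss.backward()  # in practice a surrogate gradient handles spk1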

Animated versions of these results are provided below and can be reproduced in the corresponding notebook.

Animations

High Precision Weights, Normalized Threshold

This is the optimal baseline, and shows that the task is reasonably straightforward to learn with high-precision weights.

spk_time_flt.mp4

Binarized Weights, Normalized Threshold

The results become significantly unstable when the weights are binarized.

spk_time_bin.mp4

A moving average over training iterations is applied in an attempt to clean up the plot above, but the results remain erratic:

spk_time_bin_MVA.mp4
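For reference, the moving average used to smooth these traces can be computed as follows; the window length is an arbitrary choice here, not a value from the repo.

# Simple moving average over a 1-D trace of recorded values
# (e.g., the output spike time at each training iteration).
import numpy as np

def moving_average(trace, window=50):   # window size is arbitrary
    kernel = np.ones(window) / window
    return np.convolve(trace, kernel, mode="valid")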

Binarized Weights, Large Threshold

Increasing the threshold of all neurons provides a state-space with a higher dynamic range, but raising the threshold too far leads to the dead neuron problem. The animation below shows how spiking activity has been suppressed; the flat membrane potential is purely a result of the bias.

spk_time_wthr.mp4

Binarized Weights, Threshold Annealing

Now apply threshold annealing, which uses an evolving neuronal state-space to gradually lift spiking activity. This avoids the dead neuron problem of the large-threshold case, and avoids the instability/memory leakage of the normalized-threshold case. An illustrative annealing schedule is sketched below.
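One way to picture the idea: the firing threshold starts small, so neurons spike and learn early in training, and is gradually raised toward its large final value. The exponential form and all constants below are assumptions for illustration, not the paper's exact schedule.

# Illustrative threshold-annealing schedule; the functional form
# and constants are assumptions, not the paper's exact schedule.
import math

def annealed_threshold(step, theta_init=1.0, theta_final=16.0, tau=1000.0):
    return theta_final + (theta_init - theta_final) * math.exp(-step / tau)

# The threshold grows smoothly from 1.0 toward 16.0:
for step in (0, 500, 2000, 10000):
    print(step, round(annealed_threshold(step), 2))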

spk_time_tha.mp4

This now looks far more functional than all previous binarized cases. We can take a moving average to smooth out the impact of sudden reset dynamics. Although not as accurate as the high-precision case, the binarized SNN continues to learn despite the excessively high final threshold.

spk_time_tha_MVA.mp4