A working Python (≥3.6) interpreter and the `pip` package manager are required. All required libraries and packages can be installed using `pip install -r requirements.txt`. To avoid potential package conflicts, the use of a conda environment is recommended. The following commands create and activate a separate conda environment, clone this repository, and install all dependencies:

```
conda create -n snn-tha python=3.8
conda activate snn-tha
git clone https://github.com/jeshraghian/snn-tha.git
cd snn-tha
pip install -r requirements.txt
```
To execute the code, `cd` into one of the four dataset directories and run `python run.py`.
- In each directory, `conf.py` defines all configuration parameters and hyperparameters for that dataset. The default parameters in this repo are identical to those reported in the corresponding paper for the binarized case with bounded homeostasis.
- To run binarized networks, set `"binarize": True` in `conf.py`. For optimized parameters, follow the values reported in the paper.
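As a hypothetical illustration of how this toggle might be read (the key names other than `"binarize"` are assumptions, not the repository's actual parameters), a `conf.py`-style configuration could look like:

```python
# Hypothetical sketch of a conf.py-style configuration dictionary.
# All keys except "binarize" are illustrative assumptions.
conf = {
    "binarize": True,   # enable binarized weights (set False for full precision)
    "num_epochs": 100,  # illustrative value
    "lr": 1e-3,         # illustrative value
}

def is_binarized(conf):
    """Return True when the network should use binarized weights."""
    return bool(conf.get("binarize", False))
```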
Section 4 of the paper demonstrates the use of bounded homeostasis (using threshold annealing as the warm-up technique) in a spike-timing task. A fully connected network of structure 100-1000-1 is used, where a Poisson spike train is passed at the input, and the output neuron is trained to spike at a target time by linearly ramping up the membrane potential over time, using a mean square error loss at each time step:
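The task setup described above can be sketched as follows. This is an illustrative numpy reconstruction, not the repository's code; the time horizon, firing rate, and threshold value are assumptions:

```python
import numpy as np

# Sketch of the spike-timing task: rate-coded Poisson inputs and a
# membrane-potential target that ramps linearly up to the threshold.
rng = np.random.default_rng(0)

num_steps, num_inputs = 200, 100  # time steps; 100 input neurons (100-1000-1 net)
rate = 0.1                        # per-step firing probability (assumption)

# Poisson (Bernoulli-per-step) input spike train: shape (time, inputs)
spk_in = (rng.random((num_steps, num_inputs)) < rate).astype(np.float32)

# Linear ramp target for the output membrane potential, reaching the
# threshold (here 1.0, an assumption) at the desired spike time.
threshold = 1.0
target = np.linspace(0.0, threshold, num_steps)

def mse_per_step(mem, target):
    """Mean square error applied at every time step, then averaged."""
    return float(np.mean((mem - target) ** 2))
```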
The animated versions of the above figures are provided below, and can be reproduced in the corresponding notebook.
This is the optimal baseline; it shows that the task is reasonably straightforward to learn.
spk_time_flt.mp4
When the weights are binarized, the results become significantly unstable.
spk_time_bin.mp4
A moving average over training iterations is applied in an attempt to clean up the plot above, but the results remain incoherent:
spk_time_bin_MVA.mp4
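The smoothing above can be reproduced with a plain trailing moving average; the window length here is an assumption:

```python
import numpy as np

def moving_average(x, window=50):
    """Smooth a noisy training curve with a simple trailing moving average."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")
```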
Increasing the threshold of all neurons provides a state-space with a higher dynamic range. But raising the threshold too far leads to the dead neuron problem. The animation below shows how spiking activity is suppressed; the flat membrane potential is purely a result of the bias.
spk_time_wthr.mp4
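The dead neuron effect can be demonstrated with a minimal leaky integrate-and-fire simulation (this is an illustrative model, not the repository's; the decay factor, input scale, and thresholds are assumptions):

```python
import numpy as np

def count_spikes(threshold, beta=0.9, num_steps=500, input_scale=0.2, seed=0):
    """Count output spikes of a single LIF neuron driven by random input.

    With leak factor beta and inputs bounded by input_scale, the membrane
    potential can never exceed input_scale / (1 - beta); any threshold
    above that bound silences the neuron entirely (the dead neuron problem).
    """
    rng = np.random.default_rng(seed)
    mem, spikes = 0.0, 0
    for _ in range(num_steps):
        mem = beta * mem + input_scale * rng.random()  # leaky integration
        if mem >= threshold:
            spikes += 1
            mem = 0.0  # reset-to-zero after a spike
    return spikes
```

With these constants the membrane potential is bounded above by 0.2 / (1 - 0.9) = 2, so a threshold of 5 can never be reached, while a threshold of 0.5 is crossed regularly.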
Now apply threshold annealing, which uses an evolving neuronal state-space to gradually lift spiking activity. This avoids the dead neuron problem of the large-threshold case, and the instability/memory leakage of the normalized-threshold case.
spk_time_tha.mp4
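A schematic version of this warm-up can be written as a threshold that relaxes smoothly from a small start value toward its final, large value over training steps. The exponential schedule and all constants below are illustrative assumptions; the paper's exact annealing rule may differ:

```python
import numpy as np

def annealed_threshold(step, theta_start=1.0, theta_final=16.0, tau=200.0):
    """Illustrative threshold-annealing schedule: the firing threshold
    grows from theta_start toward theta_final with time constant tau,
    so early training sees a small threshold (plenty of spikes) and
    late training sees the large final threshold."""
    return theta_final + (theta_start - theta_final) * np.exp(-step / tau)
```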
This now looks far more functional than all previous binarized cases. We can take a moving average to smooth out the impact of sudden reset dynamics. Although not as accurate as the high-precision case, the binarized SNN continues to learn despite the excessively high final threshold.