# SparseProp: Efficient Event-Based Simulation and Training of Sparse Recurrent Spiking Neural Networks
This repository contains the implementation code for the manuscript:
SparseProp: Efficient Event-Based Simulation and Training of Sparse Recurrent Spiking Neural Networks
In this work, we propose SparseProp, a novel event-based algorithm for simulating and training spiking neural networks. For sparse spiking networks, it reduces the computational cost per network spike from O(N) to O(log N). We provide example implementations for recurrent networks of leaky integrate-and-fire (LIF) and quadratic integrate-and-fire (QIF) neurons, and we extend the algorithm via Chebyshev polynomials to neuron models that lack an analytical solution for the next spike time.
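At its core, the algorithm keeps all neurons in a binary heap keyed by their next spike time, so only the spiking neuron and its k postsynaptic targets need to be touched per network spike. The following is a minimal, self-contained sketch of that idea, with a toy update rule and hypothetical names (`event_loop_sketch`, `post`, `handle` are ours, not the repository's):

```julia
using DataStructures, Random

# Sketch only: a mutable binary min-heap stores every neuron's next spike
# time, so finding the next spiking neuron and updating its k postsynaptic
# targets costs O(k log N) per network spike instead of O(N).
function event_loop_sketch(; n = 1000, k = 10, nspikes = 10_000, seed = 1)
    rng = MersenneTwister(seed)
    post = [rand(rng, 1:n, k) for _ in 1:n]      # k postsynaptic targets per neuron
    h = MutableBinaryMinHeap{Float64}()
    handle = [push!(h, rand(rng)) for _ in 1:n]  # one heap handle per neuron
    for _ in 1:nspikes
        t, j = top_with_handle(h)                # neuron j spikes next: O(1)
        update!(h, handle[j], t + 1.0)           # reschedule j after reset (toy ISI)
        for i in post[j]                         # deliver the spike to k targets
            # toy inhibitory coupling: delay each target's next spike slightly;
            # the real update depends on the neuron model (phase or time based)
            update!(h, handle[i], h[handle[i]] + 0.01)   # O(log n) per target
        end
    end
    return h
end

event_loop_sketch()
```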
## Requirements
- Julia (>= 1.5, tested on 1.9.1), available from https://julialang.org/downloads/
- The Julia packages DataStructures, RandomNumbers, PyPlot, ApproxFun, DataInterpolations, and DifferentialEquations
## Installation
To install the required packages, run the following in the Julia REPL after installing Julia:

```julia
using Pkg
for pkg in ["RandomNumbers", "PyPlot", "DataStructures", "ApproxFun", "DataInterpolations", "DifferentialEquations"]
    Pkg.add(pkg)
end
```
## Usage
For example, to run a spiking network of 10^5 leaky integrate-and-fire neurons:

```julia
include("LIF_SparseProp.jl")
```
## Repository contents

### LIF_SparseProp.jl
Contains an example implementation of a LIF network with SparseProp.

The function `lifnet` has the following input parameters:
- `n`: number of neurons
- `k`: number of synapses per neuron
- `j0`: synaptic strength
- `τ`: membrane time constant
- `seedic`: seed of the random number generator for the initial condition
- `seednet`: seed of the random number generator for the network realization
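A call might then look like the following sketch. The parameter values are placeholders for illustration, not the values used in the manuscript, and we assume the arguments are passed in the order listed above:

```julia
include("LIF_SparseProp.jl")

n       = 10^5   # number of neurons
k       = 100    # synapses per neuron
j0      = 1.0    # synaptic strength
τ       = 0.01   # membrane time constant
seedic  = 1      # seed for the initial condition
seednet = 2      # seed for the network realization

lifnet(n, k, j0, τ, seedic, seednet)
```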
### QIF network (phase representation)
Contains an example implementation of a QIF network with SparseProp. The function `qifnet` has the same input parameters as `lifnet` above.
### QIF network (time-based heap)
Contains an example implementation of a QIF network with SparseProp. Here, instead of the phase representation, we use a time-based heap.
### QIF network (time-based heap, heterogeneous input)
Contains an example implementation of a QIF network with SparseProp. Here, instead of the phase representation, we use a time-based heap, and every neuron receives a different input current; a closed-form next-spike-time expression for this case is sketched below.
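The heterogeneous case only changes which input current enters the next-spike-time map. As a sketch in our notation (the repository scripts may use a different parameterization): for the QIF model dV/dt = V² + I with I > 0 and spike threshold at +∞, the time to the next spike from voltage V is known in closed form, and each neuron simply evaluates it with its own current:

```julia
# t(V, I) = (π/2 - atan(V/√I)) / √I  for dV/dt = V² + I, I > 0, threshold at +∞
time_to_spike(V, I) = (π / 2 - atan(V / sqrt(I))) / sqrt(I)

I_i = 1.0 .+ 0.1 .* randn(10)     # hypothetical per-neuron input currents
V_i = randn(10)                   # hypothetical membrane potentials
time_to_spike.(V_i, I_i)          # per-neuron next spike times, e.g. to fill the heap
```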
### EIF_SparseProp.jl
Contains an example implementation of an EIF network with SparseProp. The next spike time is found using a precalculated lookup table. To create the lookup table, we solve the ordinary differential equation of the exponential integrate-and-fire model with high precision using the DifferentialEquations.jl package. This solution is then turned into a lookup table using DataInterpolations.jl, from which the phase transition is calculated with high precision.
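The construction might look like the following sketch; the parameter values and names are hypothetical, and the actual script builds the full phase transition map rather than only the time-to-threshold inversion shown here:

```julia
using DifferentialEquations, DataInterpolations

# EIF dynamics: τ dV/dt = -V + ΔT * exp((V - V_T)/ΔT) + I
# (hypothetical parameter values; the repository script defines its own)
τ, ΔT, V_T, I = 10.0, 1.0, 1.0, 2.0
V_reset, V_thresh = 0.0, 20.0
f(V, p, t) = (-V + ΔT * exp((V - V_T) / ΔT) + I) / τ

# integrate from reset until the threshold crossing, with tight tolerances
stop = ContinuousCallback((V, t, integ) -> V - V_thresh, terminate!)
sol = solve(ODEProblem(f, V_reset, (0.0, 1e3)), Vern9();
            callback = stop, abstol = 1e-13, reltol = 1e-13)

# invert the monotonically increasing trajectory: time at which voltage V is reached
ts = range(0.0, sol.t[end]; length = 10_001)
t_of_V = CubicSpline(collect(ts), sol.(ts))   # DataInterpolations: values, then nodes

# time to the next spike for a neuron currently at voltage V0
time_to_spike(V0) = sol.t[end] - t_of_V(V0)
```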
### EIF network with Chebyshev approximation
Same as EIF_SparseProp.jl, but the phase transition curve is approximated by Chebyshev polynomials. This requires the packages ApproxFun.jl, DataInterpolations.jl, and DifferentialEquations.jl.
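Continuing the hypothetical sketch above, the spline lookup table could be replaced by a Chebyshev approximant; we fix the number of coefficients here, and the repository may construct the approximant differently:

```julia
using ApproxFun

# Chebyshev polynomial fit of the time-to-threshold curve on [V_reset, V_thresh],
# sampled with a fixed number of coefficients (sketch only)
t_cheb = Fun(V -> sol.t[end] - t_of_V(V), Chebyshev(V_reset..V_thresh), 200)
t_cheb(0.5)   # fast polynomial evaluation of the time to threshold from V = 0.5
```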
### Verification of numerical precision
To confirm the exactness of our numerical approach of inferring the next spike time from a lookup table based on a numerical solution of the single-neuron dynamics, we compare the LIF network simulated with the analytical phase transition curve against the same network simulated with the numerically approximated phase transition curve obtained via DifferentialEquations.jl and DataInterpolations.jl. This requires the packages ApproxFun.jl, DataInterpolations.jl, and DifferentialEquations.jl. As shown in the accompanying figures, for a network simulation of 10^5 neurons the errors in the spike times are close to machine precision; they are so small that no spike index changes.
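The check works because the LIF time to threshold is known in closed form. A minimal sketch, with hypothetical parameter values and restating the names from the EIF sketch above:

```julia
using DifferentialEquations, DataInterpolations

# LIF model τ dV/dt = -V + I with I > V_thresh: the analytical time to threshold
# is t(V) = τ log((I - V)/(I - V_thresh)), so the lookup-table approximation
# can be checked against it directly.
τ, I, V_reset, V_thresh = 10.0, 2.0, 0.0, 1.0
t_analytic(V) = τ * log((I - V) / (I - V_thresh))

f_lif(V, p, t) = (-V + I) / τ
stop = ContinuousCallback((V, t, integ) -> V - V_thresh, terminate!)
sol = solve(ODEProblem(f_lif, V_reset, (0.0, 1e3)), Vern9();
            callback = stop, abstol = 1e-13, reltol = 1e-13)

ts = range(0.0, sol.t[end]; length = 10_001)
t_of_V = CubicSpline(collect(ts), sol.(ts))   # invert the trajectory
t_numeric(V) = sol.t[end] - t_of_V(V)

# largest next-spike-time error over a range of membrane potentials
maximum(abs(t_analytic(V) - t_numeric(V)) for V in range(V_reset, 0.99, length = 101))
```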
## Notes
- A full specification of the packages used and their versions can be found in `packages.txt`.
- For all calculations, a burn-in period was discarded to let the network state converge to a stationary state.
- All simulations were run on a single CPU and took on the order of minutes to a few hours.