guillaume-chevalier/Spiking-Neural-Network-SNN-with-PyTorch-where-Backpropagation-engenders-STDP
What about coding a Spiking Neural Network using an automatic differentiation framework? In SNNs, there is a time axis: the neural network sees data throughout time, and activation functions are replaced by spikes that are raised once the pre-activation passes a certain threshold. Pre-activation values constantly fade if neurons aren't excited enough.
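The dynamics described above can be sketched as a leaky integrate-and-fire neuron. This is a minimal pure-Python illustration of the general mechanism (fading pre-activation, threshold, spike), not the repository's actual PyTorch implementation; the `threshold` and `decay` values are assumptions for the example.

```python
def lif_step(v, x, threshold=1.0, decay=0.9):
    """One time step of a leaky integrate-and-fire neuron (illustrative)."""
    # The pre-activation fades (leaks) each step, then integrates the input.
    v = decay * v + x
    # A spike is raised once the pre-activation passes the threshold.
    spike = 1.0 if v >= threshold else 0.0
    # A neuron that fired resets its potential.
    if spike:
        v = 0.0
    return v, spike

# Drive one neuron with a constant sub-threshold input: the potential
# builds up over time, crosses the threshold, spikes, and resets.
v, spikes = 0.0, []
for t in range(10):
    v, s = lif_step(v, 0.3)
    spikes.append(s)
```

With a constant input of 0.3, the potential climbs over several steps (0.3, 0.57, 0.813, 1.03...), so the neuron fires periodically rather than on every step, which is the temporal behavior the description refers to.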
Jupyter Notebook
Issues
Simple Conceptual Questions
#7 opened by SeanPedersen · 0 comments
Layer-wise Spiking Activity
#6 opened by tehreemnaqvi · 0 comments
Add BibTeX citation instructions in .ipynb Notebook as well and not just in the README.
#4 opened by guillaume-chevalier · 1 comment
Backpropagation engenders Hebbian learning
#3 opened by clockwiser · 2 comments
To spike or not to spike
#1 opened by RR5555 · 0 comments
average_output flag flipped?
#2 opened by ruslanmustafin