SNN-learning-methods-papers

This repository collects SNN papers from top conferences, focusing on surrogate gradient (SG) learning and ANN-SNN conversion.

Contents

Getting Started

  • 脉冲神经网络研究进展综述 (WuYujie) Link

    describe: Currently the best Chinese-language survey of SNNs; reading this one paper is enough to get started with SNNs.

Advanced

Must-read SG papers

  • Surrogate Gradient Learning in Spiking Neural Networks. (Mostafa) Link

    describe: Rate-based SG. Treats SNNs like RNNs and trains them with surrogate gradients.

  • Spatio-temporal backpropagation for training high-performance spiking neural networks. (WuYujie) Link

    describe: Rate-based SG. Trains SNNs with BPTT (the STBP method); a minimal surrogate-gradient sketch follows this list.
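
A minimal PyTorch sketch of the surrogate-gradient idea shared by the two papers above: the forward pass emits hard spikes, the backward pass uses a smooth surrogate, and training unrolls the network over time like an RNN. The rectangular surrogate window, the LIF update, and all constants are illustrative assumptions, not the exact formulation of either paper.

```python
# Surrogate-gradient training: the forward pass emits a hard spike, while the
# backward pass replaces the non-differentiable Heaviside step with a smooth
# surrogate (here a rectangular window around the threshold).
import torch
import torch.nn as nn

class SpikeFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh >= 0).float()              # Heaviside step -> spike

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thresh,) = ctx.saved_tensors
        surrogate = (v_minus_thresh.abs() < 0.5).float()  # rectangular surrogate window
        return grad_output * surrogate

class LIFLayer(nn.Module):
    def __init__(self, tau=2.0, v_thresh=1.0):
        super().__init__()
        self.tau, self.v_thresh = tau, v_thresh

    def forward(self, x_seq):                              # x_seq: [T, batch, features]
        v = torch.zeros_like(x_seq[0])
        spikes = []
        for x_t in x_seq:                                  # unrolled over time, like an RNN
            v = v + (x_t - v) / self.tau                   # leaky integration
            s = SpikeFn.apply(v - self.v_thresh)           # fire through the surrogate
            v = v * (1.0 - s)                              # hard reset after a spike
            spikes.append(s)
        return torch.stack(spikes)                         # gradients flow back via BPTT
```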

Must-read ANN-SNN works

  • Conversion of continuous-valued deep networks to efficient event-driven networks for image classification. (Rueckauer) Link

    describe: Rate-based conversion.

  • Fast-Classifying, High-Accuracy Spiking Deep Networks Through Weight and Threshold Balancing. (LiuShichi) Link

    describe: Rate-based conversion; a weight/threshold-balancing sketch follows this list.
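
A minimal sketch of rate-based conversion by weight and threshold balancing in the spirit of the two works above. It assumes a plain feed-forward ReLU ANN built from nn.Linear layers and the common data-based normalization recipe; the helper name and calibration loop are illustrative.

```python
# Rescale each layer by the maximum activation observed on calibration data so
# that, after replacing ReLU with IF neurons (threshold 1.0), firing rates
# approximate the ANN activations in [0, 1].
import torch

@torch.no_grad()
def normalize_weights(linears, calib_batches):
    """linears: the nn.Linear layers of a ReLU ANN, in forward order.
    calib_batches: iterable of input tensors used to estimate max activations."""
    max_act = [0.0] * len(linears)
    for x in calib_batches:                       # record per-layer max activations
        h = x
        for i, layer in enumerate(linears):
            h = torch.relu(layer(h))
            max_act[i] = max(max_act[i], h.max().item())
    prev_scale = 1.0
    for i, layer in enumerate(linears):           # balance weights layer by layer
        scale = max_act[i]
        layer.weight.mul_(prev_scale / scale)
        if layer.bias is not None:
            layer.bias.div_(scale)
        prev_scale = scale                        # IF thresholds can then stay at 1.0
```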

ICML

2020

  • RMP-SNN: Residual membrane potential neuron for enabling deeper high-accuracy and low-latency spiking neural network. (HanBing) Link

    describe: Rate-based conversion. Gives a thorough analysis of soft reset and is the first work to use it to scale conversion to large networks; a soft-reset sketch follows below.
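
A minimal sketch contrasting hard reset with the soft reset ("reset by subtraction") analysed in RMP-SNN; the function and its constants are illustrative, not the paper's code.

```python
# Soft reset subtracts the threshold and keeps the residual membrane potential,
# instead of discarding it as a hard reset does, which reduces conversion error.
import torch

def if_step(v, x, v_thresh=1.0, soft_reset=True):
    """One integrate-and-fire step. v: membrane potential, x: input current."""
    v = v + x
    spike = (v >= v_thresh).float()
    if soft_reset:
        v = v - spike * v_thresh      # keep the residual potential
    else:
        v = v * (1.0 - spike)         # hard reset to zero, residual is lost
    return v, spike
```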

2021

  • A free lunch from ANN: Towards efficient, accurate spiking neural networks calibration. (Gushi) Link

    describe: Rate-based conversion. Builds on the ICLR 2021 work by adding a layer-by-layer error-reduction (calibration) method; the results are very good.

ICLR

2021

  • Optimal conversion of conventional artificial neural networks to spiking neural networks. (Gushi) Link

    describe: Rate-based conversion. Systematically analyses the flooring error of conversion; a must-read paper on conversion.

2022

  • Temporal Efficient Training of Spiking Neural Network via Gradient Re-weighting. (Gushi) Link

    describe: Rate-based SG. Adjusts how the loss is computed (gradient re-weighting over timesteps); a loss sketch follows this list.

  • Optimal ANN-SNN Conversion for High-accuracy and Ultra-low-latency Spiking Neural Networks. (YuZhaofei) Link

    describe: Rate-based conversion. Systematically analyses three kinds of conversion error and proposes converting a QNN into an SNN; recommended reading.
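
A minimal sketch of the loss adjustment behind the gradient re-weighting paper above (TET-style training), assuming SNN logits of shape [T, batch, classes]; any additional regularization used in the paper is omitted.

```python
# Instead of applying cross-entropy once to the time-averaged output, the loss
# is applied at every timestep and then averaged, so each timestep receives a
# direct error signal.
import torch
import torch.nn.functional as F

def per_timestep_loss(outputs, target):
    """outputs: [T, batch, num_classes] SNN logits; target: [batch] labels."""
    return torch.stack([F.cross_entropy(o_t, target) for o_t in outputs]).mean()

def mean_output_loss(outputs, target):
    """Conventional baseline: cross-entropy on the time-averaged logits."""
    return F.cross_entropy(outputs.mean(dim=0), target)
```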

NeurIPS

2021

  • Differentiable Spike: Rethinking Gradient-Descent for Training Spiking Neural Networks. (Gushi) Link

    describe: Rate-based SG.

  • Deep Residual Learning in Spiking Neural Networks. (YuZhaofei) Link

    describe: Rate-based method. Addresses the problem that SNNs with a plain ResNet structure cannot realize identity mapping by placing the IF neuron inside the residual block; a block sketch follows this list.
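
A minimal sketch of the residual design described in the second paper above: the spiking neuron stays inside the block and the shortcut is added after it, so a zero output from the convolutional path leaves the input spikes unchanged (identity mapping). The single-timestep 2D layout, element-wise addition, and module structure are assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class SpikingResidualBlock(nn.Module):
    """One residual block operating on a spike map of one timestep
    ([batch, C, H, W]); the neuron module keeps its own state across timesteps."""
    def __init__(self, channels, spiking_neuron):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.sn = spiking_neuron                 # e.g. an IF neuron module returning spikes

    def forward(self, x_spikes):
        out = self.sn(self.conv(x_spikes))       # neuron sits inside the block
        return out + x_spikes                    # shortcut added after the neuron
```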

IJCAI

2021

  • Optimal ANN-SNN conversion for fast and accurate inference in deep spiking neural networks. (YuZhaofei) Link

    describe: Rate-based conversion. Obtains the optimal normalization parameters through backpropagation.

AAAI

2019

  • Direct Training for Spiking Neural Networks: Faster, Larger, Better. (WuYujie) Link

    describe: Rate-based SG. Proposes NeuNorm; a follow-up to STBP, recommended reading.

2021

  • Going Deeper With Directly-Trained Larger Spiking Neural Networks. (WuYujie) Link

    describe: Rate-based SG. Proposes tdBN; a follow-up to STBP, recommended reading; a tdBN sketch follows below.
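
A minimal sketch of the tdBN idea from the 2021 paper above: pre-activations are batch-normalized jointly over the time and batch dimensions and rescaled in proportion to the firing threshold V_th. Using BatchNorm3d and multiplying by alpha * V_th after the affine step are simplifications; the paper's exact placement of the scaling differs.

```python
import torch
import torch.nn as nn

class tdBN(nn.Module):
    """Threshold-dependent batch norm over a spike-train tensor [T, batch, C, H, W]."""
    def __init__(self, channels, v_thresh=1.0, alpha=1.0):
        super().__init__()
        self.bn = nn.BatchNorm3d(channels)       # normalizes jointly over batch and time
        self.scale = alpha * v_thresh

    def forward(self, x_seq):                    # x_seq: [T, batch, C, H, W]
        x = x_seq.permute(1, 2, 0, 3, 4)         # -> [batch, C, T, H, W] for BatchNorm3d
        x = self.bn(x) * self.scale              # rescale relative to the threshold
        return x.permute(2, 0, 1, 3, 4)          # back to [T, batch, C, H, W]
```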

2022

  • Optimized Potential Initialization for Low-latency Spiking Neural Networks. (YuZhaofei) Link

    describe: Rate-based conversion. Proposes initializing the neuron's membrane potential at half the threshold to reduce the flooring error; simple and effective (sketch below).
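
A minimal sketch (plain Python, illustrative constants) of why starting the membrane potential at half the threshold helps: it turns the flooring behaviour of the spike count into rounding, shrinking the quantization error at low latency.

```python
def if_spike_count(x_rate, T=8, v_thresh=1.0, v_init=0.0):
    """Spike count of one IF neuron (soft reset) driven by a constant input for T steps."""
    v, count = v_init, 0
    for _ in range(T):
        v += x_rate
        if v >= v_thresh:
            v -= v_thresh                 # soft reset
            count += 1
    return count

# v_init = 0 behaves like floor(T * x_rate); v_init = v_thresh / 2 behaves like
# rounding: for x_rate = 0.6 and T = 8 this prints "4 5" (true rate sum = 4.8).
print(if_spike_count(0.6, v_init=0.0), if_spike_count(0.6, v_init=0.5))
```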

ICCV

2021

  • Incorporating Learnable Membrane Time Constant to Enhance Learning of Spiking Neural Networks. (YuZhaofei) Link

    describe: Rate-based SG. Optimizes the membrane time constant with a gradient-based method (sketch below).
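
A minimal sketch of a LIF neuron with a learnable membrane time constant, in the spirit of the paper above: a raw parameter is passed through a sigmoid so that the decay factor 1/tau stays in (0, 1) and can be optimized by gradient descent. The sigmoid surrogate, its sharpness, and the reset rule are illustrative assumptions.

```python
import math
import torch
import torch.nn as nn

class LearnableTauLIF(nn.Module):
    def __init__(self, init_tau=2.0, v_thresh=1.0):
        super().__init__()
        # sigmoid(w) = 1 / init_tau  =>  w = -log(init_tau - 1)
        self.w = nn.Parameter(torch.tensor(-math.log(init_tau - 1.0)))
        self.v_thresh = v_thresh

    def forward(self, x_seq):                                  # x_seq: [T, batch, features]
        decay = torch.sigmoid(self.w)                          # learnable 1/tau in (0, 1)
        v = torch.zeros_like(x_seq[0])
        spikes = []
        for x_t in x_seq:
            v = v + decay * (x_t - v)                          # membrane update, learnable decay
            sur = torch.sigmoid(4.0 * (v - self.v_thresh))     # smooth surrogate for backward
            s = sur + ((v >= self.v_thresh).float() - sur).detach()  # hard spike in forward
            v = v * (1.0 - s)                                  # reset after a spike
            spikes.append(s)
        return torch.stack(spikes)
```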