sparse-neural-networks
There are 31 repositories under the sparse-neural-networks topic.
dcmocanu/sparse-evolutionary-artificial-neural-networks
Always sparse. Never dense. But never say never. A sparse training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, Sparse Evolutionary Training, which boosts Deep Learning scalability in several respects (e.g. memory and computational efficiency, representation and generalization power).
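At the core of Sparse Evolutionary Training is a prune-and-regrow cycle: after each training epoch, a fraction of the weakest connections is removed and the same number is regrown at random positions, so the network stays sparse at a fixed budget throughout training. A minimal sketch of that topology update, with illustrative function and parameter names not taken from the repository:

```python
import numpy as np

def set_evolve(weights, mask, zeta=0.3, rng=np.random.default_rng(0)):
    """One SET topology update on a weight matrix and its binary mask.
    `zeta` is the fraction of active connections replaced per update
    (an assumed name for the paper's rewiring rate)."""
    active = np.flatnonzero(mask)
    n_drop = int(zeta * active.size)
    # 1) prune the zeta fraction of active weights closest to zero
    drop = active[np.argsort(np.abs(weights.ravel()[active]))[:n_drop]]
    mask.ravel()[drop] = 0
    weights.ravel()[drop] = 0.0
    # 2) regrow the same number of connections at random inactive positions
    inactive = np.flatnonzero(mask.ravel() == 0)
    grow = rng.choice(inactive, size=n_drop, replace=False)
    mask.ravel()[grow] = 1
    weights.ravel()[grow] = rng.normal(0.0, 0.01, size=n_drop)
    return weights, mask

# The connection budget is preserved across the update:
rng = np.random.default_rng(1)
m = (rng.random((8, 8)) < 0.2).astype(float)
w = rng.normal(size=(8, 8)) * m
n_before = int(m.sum())
w, m = set_evolve(w, m)
assert int(m.sum()) == n_before
```

Because pruning and regrowth exchange the same number of connections, overall sparsity is constant while the topology adapts to the data.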
nanowell/Q-Sparse-LLM
My implementation of Q-Sparse: All Large Language Models can be Fully Sparsely-Activated
VITA-Group/SFW-Once-for-All-Pruning
[ICLR 2022] "Learning Pruning-Friendly Networks via Frank-Wolfe: One-Shot, Any-Sparsity, and No Retraining" by Lu Miao*, Xiaolong Luo*, Tianlong Chen, Wuyang Chen, Dong Liu, Zhangyang Wang
VITA-Group/SMC-Bench
[ICLR 2023] "Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!" Shiwei Liu, Tianlong Chen, Zhenyu Zhang, Xuxi Chen, Tianjin Huang, Ajay Kumar Jaiswal, Zhangyang Wang
Efficient-Scalable-Machine-Learning/EvNN
Event-based neural networks
zahraatashgahi/QuickSelection
[Machine Learning Journal (ECML-PKDD 2022 journal track)] Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders
IlanPrice/DCTpS
Code for testing DCT plus Sparse (DCTpS) networks
lim142857/Sparsifiner
Demo code for CVPR2023 paper "Sparsifiner: Learning Sparse Instance-Dependent Attention for Efficient Vision Transformers"
bcrafton/ssdfa
GitHub page for SSDFA
GhadaSokar/Dynamic-Sparse-Training-for-Deep-Reinforcement-Learning
[IJCAI 2022] "Dynamic Sparse Training for Deep Reinforcement Learning" by Ghada Sokar, Elena Mocanu, Decebal Constantin Mocanu, Mykola Pechenizkiy, and Peter Stone.
cambridge-mlg/arch_uncert
Code for "Variational Depth Search in ResNets" (https://arxiv.org/abs/2002.02797)
GhadaSokar/SpaceNet
Implementation for the paper "SpaceNet: Make Free Space For Continual Learning" in PyTorch.
zahraatashgahi/NeuroFS
[TMLR] Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks
A-Klass/torch_topkast
PyTorch Implementation of TopKAST
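TopKAST (Top-K Always Sparse Training) runs the forward pass through only the K largest-magnitude weights per layer, while a slightly larger backward set receives gradients so pruned weights can re-enter. A hedged sketch of the masking step, with names that are assumptions rather than the repository's API:

```python
import numpy as np

def topk_mask(weights, density):
    """Binary mask keeping roughly the top `density` fraction of |weights|
    (ties at the threshold may keep slightly more)."""
    k = max(1, int(density * weights.size))
    threshold = np.sort(np.abs(weights), axis=None)[-k]
    return (np.abs(weights) >= threshold).astype(weights.dtype)

rng = np.random.default_rng(0)
w = rng.normal(size=(16, 16))
fwd_mask = topk_mask(w, density=0.10)   # forward pass: densest 10% of weights
bwd_mask = topk_mask(w, density=0.15)   # backward set: a superset, for exploration
y = (w * fwd_mask) @ np.ones(16)        # sparse forward computation
```

Since the backward density is higher, its magnitude threshold is lower, so the forward mask is always a subset of the backward mask.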
BICLab/RM-SNN
Official implementation of "Sparser spiking activity can be better: Feature Refine-and-Mask spiking neural network for event-based visual recognition" (Neural Networks 2023)
VITA-Group/Peek-a-Boo
[ICLR 2022] "Peek-a-Boo: What (More) is Disguised in a Randomly Weighted Neural Network, and How to Find It Efficiently", by Xiaohan Chen, Jason Zhang and Zhangyang Wang.
SurrealVectors/Soevnn
A neural net with a terminal-based testing program.
zahraatashgahi/CTRE
[Machine Learning Journal (ECML-PKDD 2022 journal track)] A Brain-inspired Algorithm for Training Highly Sparse Neural Networks
elch10/SignalNeuroHackTasks
Implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks"
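The lottery ticket procedure is iterative magnitude pruning with rewinding: train the network, prune the smallest-magnitude surviving weights, reset the survivors to their initial values, and repeat. A minimal sketch of one round, where `trained` stands in for weights after a real training run and all names are illustrative:

```python
import numpy as np

def prune_and_rewind(trained, initial, mask, prune_frac=0.2):
    """Prune `prune_frac` of the surviving weights by magnitude,
    then rewind the rest to their initialization values."""
    active = np.flatnonzero(mask)
    n_prune = int(prune_frac * active.size)
    drop = active[np.argsort(np.abs(trained.ravel()[active]))[:n_prune]]
    mask.ravel()[drop] = 0
    # winning-ticket candidate: original init values under the pruned mask
    return initial * mask, mask

init = np.random.default_rng(0).normal(size=(10, 10))
mask = np.ones_like(init)
trained = init + 0.1 * np.random.default_rng(1).normal(size=(10, 10))  # stand-in for training
ticket, mask = prune_and_rewind(trained, init, mask)  # 80 of 100 weights survive
```

Repeating this loop for several rounds yields the highly sparse "winning tickets" the paper shows can train to full accuracy in isolation.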
BrosnanYuen/RayBNN_Neural
Neural Networks with Sparse Weights in Rust using GPUs, CPUs, and FPGAs via CUDA, OpenCL, and oneAPI
zahraatashgahi/PALS
[ECML-PKDD 2024] Adaptive Sparsity Level during Training for Efficient Time Series Forecasting with Transformers
ZIYU-DEEP/Generalization-and-Memorization-in-Sparse-Training
This is the repository for the SNN-22 Workshop paper on "Generalization and Memorization in Sparse Neural Networks".
amikom-gace-research-group/characterize-pruning
Characterization study repository for pruning, a popular way to compress DL models. This repo also investigates optimal sparse tensor layouts for pruned networks.
BrosnanYuen/RayBNN_Sparse
Sparse Matrix Library for GPUs, CPUs, and FPGAs via CUDA, OpenCL, and oneAPI
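Libraries like this typically build on compressed sparse formats such as CSR, which stores only nonzero values plus column indices and per-row offsets. A generic CSR matrix-vector product for illustration (not RayBNN_Sparse's actual API):

```python
import numpy as np

def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x for A stored in compressed sparse row (CSR) form."""
    y = np.zeros(len(row_ptr) - 1)
    for i in range(len(y)):
        start, end = row_ptr[i], row_ptr[i + 1]
        # dot product over row i's nonzeros only
        y[i] = values[start:end] @ x[col_idx[start:end]]
    return y

# A = [[1, 0, 2],
#      [0, 0, 3],
#      [4, 5, 0]]
values  = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
col_idx = np.array([0, 2, 2, 0, 1])
row_ptr = np.array([0, 2, 3, 5])
y = csr_matvec(values, col_idx, row_ptr, np.array([1.0, 1.0, 1.0]))
# y == [3.0, 3.0, 9.0]
```

Storage and compute both scale with the number of nonzeros rather than the full matrix size, which is what makes sparse weights attractive on GPUs, CPUs, and FPGAs alike.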
Efficient-Scalable-Machine-Learning/DeepRewire
Easily create and optimize PyTorch networks as in the Deep Rewiring paper (https://igi-web.tugraz.at/PDF/241.pdf). Install using 'pip install deep_rewire'
guyez/Simple-Sparsely-Connected-NN
Simple C++ implementation of a sparsely connected multi-layer neural network using OpenMP and CUDA for parallelization.
MarwanNour/Sparse-Neural-Networks-GPU
GPU Computing course project
zahraatashgahi/Neuron-Attribution
[ECAI 2024] Unveiling the Power of Sparse Neural Networks for Feature Selection
DanielEftekhari/neural-network-pruning
Neural Network Sparsification via Pruning
neilkichler/robustness_set
Robustness of Sparse Multilayer Perceptrons for Supervised Feature Selection
mxmnburger/structure-of-lottery-tickets
Master's Thesis Project - Lottery Tickets contain independent subnetworks when trained on independent tasks.