sparse-training
There are 14 repositories under the sparse-training topic.
google-research/rigl
End-to-end training of sparse deep neural networks with little-to-no performance loss.
dcmocanu/sparse-evolutionary-artificial-neural-networks
Always sparse. Never dense. But never say never. A sparse-training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, Sparse Evolutionary Training (SET), which boosts deep-learning scalability in several respects (e.g. memory and computational-time efficiency, representation and generalization power). A minimal illustrative sketch of the SET update follows.
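The sketch below illustrates the prune-and-regrow step at the core of SET, not code from this repository: after each training epoch, the smallest-magnitude fraction zeta of a layer's active weights is removed and the same number of connections is regrown at random inactive positions, so the layer's sparsity stays constant. The function name `set_prune_and_regrow` and the default `zeta=0.3` are illustrative assumptions.

```python
import torch

def set_prune_and_regrow(weight: torch.Tensor, mask: torch.Tensor,
                         zeta: float = 0.3) -> torch.Tensor:
    """Drop the weakest active connections and regrow an equal number at
    random inactive positions, keeping the number of active weights fixed."""
    active = mask.bool()
    n_drop = int(zeta * active.sum().item())
    if n_drop == 0:
        return mask

    # Prune: zero the n_drop active weights with the smallest magnitude.
    scores = weight.abs().masked_fill(~active, float("inf"))
    drop_idx = torch.topk(scores.flatten(), n_drop, largest=False).indices
    mask.view(-1)[drop_idx] = 0

    # Regrow: activate n_drop connections chosen uniformly at random among the
    # currently inactive positions (a simplification: this may re-select
    # positions that were just pruned).
    inactive_idx = (mask.view(-1) == 0).nonzero(as_tuple=True)[0]
    perm = torch.randperm(inactive_idx.numel(), device=inactive_idx.device)
    grow_idx = inactive_idx[perm[:n_drop]]
    mask.view(-1)[grow_idx] = 1

    # Pruned weights are zeroed; regrown connections restart from zero.
    weight.data.mul_(mask.to(weight.dtype))
    return mask
```

In a SET-style training loop, an update like this would be applied to each sparse layer at the end of every epoch, with the mask also used to zero out gradients of inactive weights.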
VITA-Group/SViTE
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Shiweiliuiiiiiii/In-Time-Over-Parameterization
[ICML 2021] "Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training" by Shiwei Liu, Lu Yin, Decebal Constantin Mocanu, Mykola Pechenizkiy
VITA-Group/ToST
[ICML 2022] Training Your Sparse Neural Network Better with Any Mask. Ajay Jaiswal, Haoyu Ma, Tianlong Chen, Ying Ding, and Zhangyang Wang
zahraatashgahi/QuickSelection
[Machine Learning Journal (ECML-PKDD 2022 journal track)] Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders
GhadaSokar/Dynamic-Sparse-Training-for-Deep-Reinforcement-Learning
[IJCAI 2022] "Dynamic Sparse Training for Deep Reinforcement Learning" by Ghada Sokar, Elena Mocanu , Decebal Constantin Mocanu, Mykola Pechenizkiy, and Peter Stone.
IGITUGraz/SparseAdversarialTraining
Code for "Training Adversarially Robust Sparse Networks via Bayesian Connectivity Sampling" [ICML 2021]
Shiweiliuiiiiiii/Selfish-RNN
[ICML 2021] "Selfish Sparse RNN Training" by Shiwei Liu, Decebal Constantin Mocanu, Yulong Pei, Mykola Pechenizkiy
GhadaSokar/SpaceNet
PyTorch implementation of the paper "SpaceNet: Make Free Space For Continual Learning".
zahraatashgahi/NeuroFS
[TMLR] Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks
A-Klass/torch_topkast
PyTorch Implementation of TopKAST
zahraatashgahi/CTRE
[Machine Learning Journal (ECML-PKDD 2022 journal track)] A Brain-inspired Algorithm for Training Highly Sparse Neural Networks
ZIYU-DEEP/Generalization-and-Memorization-in-Sparse-Training
Repository for the SNN-22 Workshop paper "Generalization and Memorization in Sparse Neural Networks".