sparse-training

There are 14 repositories under the sparse-training topic.

  • google-research/rigl

    End-to-end training of sparse deep neural networks with little-to-no performance loss.

    Language: Python
  • dcmocanu/sparse-evolutionary-artificial-neural-networks

    Always sparse. Never dense. But never say never. A sparse-training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, Sparse Evolutionary Training (SET), which boosts deep-learning scalability in several respects (e.g., memory and computational efficiency, representation and generalization power).

    Language: Python
  • VITA-Group/SViTE

    [NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang

    Language: Python
  • Shiweiliuiiiiiii/In-Time-Over-Parameterization

    [ICML 2021] "Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training" by Shiwei Liu, Lu Yin, Decebal Constantin Mocanu, Mykola Pechenizkiy

    Language: Python
  • VITA-Group/ToST

    [ICML 2022] "Training Your Sparse Neural Network Better with Any Mask" by Ajay Jaiswal, Haoyu Ma, Tianlong Chen, Ying Ding, and Zhangyang Wang

    Language: Python
  • zahraatashgahi/QuickSelection

    [Machine Learning Journal (ECML-PKDD 2022 journal track)] Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders

    Language: Python
  • GhadaSokar/Dynamic-Sparse-Training-for-Deep-Reinforcement-Learning

    [IJCAI 2022] "Dynamic Sparse Training for Deep Reinforcement Learning" by Ghada Sokar, Elena Mocanu, Decebal Constantin Mocanu, Mykola Pechenizkiy, and Peter Stone.

    Language: Python
  • IGITUGraz/SparseAdversarialTraining

    Code for "Training Adversarially Robust Sparse Networks via Bayesian Connectivity Sampling" [ICML 2021]

    Language: Python
  • Shiweiliuiiiiiii/Selfish-RNN

    [ICML 2021] "Selfish Sparse RNN Training" by Shiwei Liu, Decebal Constantin Mocanu, Yulong Pei, Mykola Pechenizkiy

    Language: Python
  • GhadaSokar/SpaceNet

    Implementation for the paper "SpaceNet: Make Free Space For Continual Learning" in PyTorch.

    Language: Python
  • zahraatashgahi/NeuroFS

    [TMLR] Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks

    Language: Python
  • A-Klass/torch_topkast

    PyTorch Implementation of TopKAST

    Language: Python
  • zahraatashgahi/CTRE

    [Machine Learning Journal (ECML-PKDD 2022 journal track)] A Brain-inspired Algorithm for Training Highly Sparse Neural Networks

    Language: Python
  • ZIYU-DEEP/Generalization-and-Memorization-in-Sparse-Training

    Repository for the SNN-22 workshop paper "Generalization and Memorization in Sparse Neural Networks".

    Language: Python
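The prune-and-regrow cycle behind Sparse Evolutionary Training (the concept implemented in dcmocanu/sparse-evolutionary-artificial-neural-networks above) can be sketched in a few lines. This is a minimal NumPy illustration, not code from that repository; the function name `set_prune_regrow` and the pruning fraction `zeta` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def set_prune_regrow(weights, mask, zeta=0.3):
    """One SET step (illustrative): drop the smallest-magnitude fraction
    `zeta` of active weights, then regrow the same number of connections
    at random currently-inactive positions, keeping sparsity constant."""
    active = np.flatnonzero(mask)
    n_drop = int(zeta * active.size)
    # prune: zero out the weakest active connections
    weakest = active[np.argsort(np.abs(weights.flat[active]))[:n_drop]]
    mask.flat[weakest] = 0
    weights.flat[weakest] = 0.0
    # regrow: activate an equal number of random dormant connections
    inactive = np.flatnonzero(mask == 0)
    grown = rng.choice(inactive, size=n_drop, replace=False)
    mask.flat[grown] = 1
    weights.flat[grown] = rng.normal(0.0, 0.1, size=n_drop)
    return weights, mask

# toy usage: a 10x10 layer at roughly 80% sparsity
mask = (rng.random((10, 10)) < 0.2).astype(int)
w = rng.normal(0.0, 0.1, (10, 10)) * mask
w, mask = set_prune_regrow(w, mask)
```

Because the number of pruned and regrown connections is equal, the layer stays at the same sparsity level throughout training, which is where the memory savings come from.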
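TopKAST (Top-K Always Sparse Training, the method behind A-Klass/torch_topkast above) keeps only the top-k weights by magnitude in the forward pass while letting gradients flow to a somewhat larger backward set. A minimal NumPy sketch of the masking step, with the helper `topk_mask` and the density values chosen purely for illustration:

```python
import numpy as np

def topk_mask(weights, density):
    """Boolean mask keeping (at least) the top `density` fraction of
    weights by absolute magnitude; ties at the threshold are kept."""
    k = max(1, int(density * weights.size))
    thresh = np.partition(np.abs(weights).ravel(), -k)[-k]
    return np.abs(weights) >= thresh

rng = np.random.default_rng(1)
w = rng.normal(size=(8, 8))
fwd = topk_mask(w, 0.1)  # sparse set used in the forward pass
bwd = topk_mask(w, 0.2)  # larger set that also receives gradients
```

Since the backward density is higher, its magnitude threshold is lower, so the forward set is always a subset of the backward set; weights just outside the forward set can accumulate gradient and later re-enter it.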