weight-sharing
There are 10 repositories under the weight-sharing topic.
mdsunivie/deeperwin
DeepErwin is a Python 3.8+ package that implements and optimizes JAX 2.x wave function models for numerical solutions to the multi-electron Schrödinger equation. DeepErwin supports weight-sharing when optimizing wave functions for multiple nuclear geometries, as well as the use of pre-trained neural network weights to accelerate optimization.
RobertCsordas/modules
The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
jaketae/param-share-transformer
PyTorch implementation of Lessons on Parameter Sharing across Layers in Transformers
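Cross-layer parameter sharing means the same weights are reused at every layer position, so the parameter count stays flat as depth grows. A minimal pure-Python sketch of that idea (hypothetical class and function names, not this repository's actual API):

```python
# Sketch of cross-layer weight sharing: one parameter set is reused
# at every "layer" position, so parameters do not grow with depth.

class TinyLinear:
    """A toy 2x2 linear layer with a bias, stored as plain lists."""
    def __init__(self):
        self.w = [[0.5, -0.25], [0.1, 0.75]]  # weight matrix
        self.b = [0.0, 0.1]                   # bias vector

    def __call__(self, x):
        return [
            sum(wi * xi for wi, xi in zip(row, x)) + bi
            for row, bi in zip(self.w, self.b)
        ]

    def num_params(self):
        return sum(len(row) for row in self.w) + len(self.b)

def build_stack(depth, shared):
    """Return `depth` layers; with sharing, a single instance is reused."""
    if shared:
        layer = TinyLinear()
        return [layer] * depth  # same object at every position
    return [TinyLinear() for _ in range(depth)]

def forward(layers, x):
    for layer in layers:
        x = layer(x)
    return x

shared_stack = build_stack(depth=6, shared=True)
unshared_stack = build_stack(depth=6, shared=False)

# Unique parameters: sharing keeps the count flat regardless of depth.
shared_params = len({id(l) for l in shared_stack}) * shared_stack[0].num_params()
unshared_params = sum(l.num_params() for l in unshared_stack)
print(shared_params, unshared_params)  # -> 6 36
```

The `[layer] * depth` list holds six references to one object, which is the essence of the technique: gradients from every position accumulate into the same weights.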
toshas/tbasis
T-Basis: a Compact Representation for Neural Networks
cliang1453/CAMERO
CAMERO: Consistency Regularized Ensemble of Perturbed Language Models with Weight Sharing (ACL 2022)
shahrukhx01/joint-learn
A comprehensive PyTorch-based toolkit for weight sharing in text classification settings.
angelocatalani/neural-network-compression
Compressing deep neural networks with pruning, trained quantization and Huffman coding
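In this compression pipeline, trained quantization is itself a form of weight sharing: weights are clustered, each weight is replaced by its cluster centroid, and only a small codebook plus per-weight indices need to be stored. A minimal 1-D clustering sketch of that idea, assuming simple k-means with fixed starting centroids (names are illustrative, not this repository's code):

```python
# Sketch of weight sharing via clustering: map each weight to its
# nearest centroid, then store a codebook (centroids) plus per-weight
# indices instead of full-precision weights.

def cluster_weights(weights, centroids, iters=10):
    """Simple 1-D k-means; returns (codebook, index for each weight)."""
    codebook = list(centroids)
    assign = []
    for _ in range(iters):
        # Assign each weight to its nearest centroid.
        assign = [
            min(range(len(codebook)), key=lambda k: abs(w - codebook[k]))
            for w in weights
        ]
        # Move each centroid to the mean of its assigned weights.
        for k in range(len(codebook)):
            members = [w for w, a in zip(weights, assign) if a == k]
            if members:
                codebook[k] = sum(members) / len(members)
    return codebook, assign

weights = [0.11, 0.09, -0.32, -0.28, 0.52, 0.48, 0.10]
codebook, idx = cluster_weights(weights, centroids=[-0.3, 0.1, 0.5])

# Reconstructed weights: every weight in a cluster shares one value.
shared = [codebook[i] for i in idx]
print(idx)  # -> [1, 1, 0, 0, 2, 2, 1]
```

Storage drops from seven full-precision values to a three-entry codebook plus seven small integer indices; in the full pipeline, those indices are what Huffman coding then compresses.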
e-dupuis/retraining-free-weight-sharing
Code for our ASP-DAC 2022 Paper: A Heuristic Exploration to Retraining-free Weight Sharing for CNN Compression
mehravehj/Balanced-Mixture-of-SuperNets
Code Release for "Balanced Mixture of Supernets for Learning CNN Pooling", AutoML2023
okz12/NN_compression
Neural Network Compression