neural-network-compression
There are 26 repositories under the neural-network-compression topic.
open-edge-platform/anomalib
An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.
csarron/awesome-emdl
Embedded and mobile deep learning research resources
imirzadeh/Teacher-Assistant-Knowledge-Distillation
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
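For context, a minimal sketch of the Hinton-style distillation loss this line of work builds on; the teacher-assistant scheme applies it in two hops (teacher → assistant, then assistant → student). The function name and hyper-parameters here are illustrative, not taken from the repo.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend KL divergence between temperature-softened distributions
    with the ordinary hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude is temperature-independent
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```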
VITA-Group/Deep-K-Means-pytorch
[ICML 2018] "Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions"
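A rough sketch of the hard-assignment step behind k-means weight sharing: cluster a layer's weights into k centroids so only the centroid table and an index map need to be stored. The paper's contribution is re-training with a harder, differentiable cluster assignment; this sketch shows only the plain compression step, and the helper name is made up.

```python
import torch
from sklearn.cluster import KMeans

@torch.no_grad()
def cluster_weights(layer, k=16):
    """Replace every weight with its nearest of k shared centroids."""
    w = layer.weight.detach().cpu().numpy().reshape(-1, 1)
    km = KMeans(n_clusters=k, n_init=10).fit(w)
    shared = km.cluster_centers_[km.labels_].reshape(tuple(layer.weight.shape))
    layer.weight.copy_(torch.from_numpy(shared).to(layer.weight))
    return km.cluster_centers_.ravel(), km.labels_
```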
ashutoshbsathe/scarpet-nn
Tools and libraries to run neural networks in Minecraft ⛏️
varungohil/Generalizing-Lottery-Tickets
This repository contains code to replicate the experiments in the NeurIPS 2019 paper "One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers" (see the pruning sketch below).
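The pruning step at the heart of the lottery-ticket procedure (train, prune by global weight magnitude, rewind survivors to their initialization, retrain) can be sketched as follows; the paper then asks whether the resulting mask transfers across datasets and optimizers. The names and the crude `"weight" in name` filter are illustrative only.

```python
import torch

def magnitude_masks(model, sparsity=0.8):
    """Global magnitude pruning: mask out the smallest |w| across
    all weight tensors (masks are later applied to the rewound init)."""
    all_w = torch.cat([p.detach().abs().flatten()
                       for n, p in model.named_parameters() if "weight" in n])
    threshold = all_w.sort().values[int(sparsity * all_w.numel())]
    return {n: (p.detach().abs() > threshold).float()
            for n, p in model.named_parameters() if "weight" in n}

@torch.no_grad()
def apply_masks(model, masks):
    for n, p in model.named_parameters():
        if n in masks:
            p.mul_(masks[n])
```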
daniel-rychlewski/hsi-toolbox
Hyperspectral CNN compression and band selection
VITA-Group/Audio-Lottery
[ICLR 2022] "Audio Lottery: Speech Recognition Made Ultra-Lightweight, Noise-Robust, and Transferable", by Shaojin Ding, Tianlong Chen, Zhangyang Wang
snu-mllab/LayerMerge
Official PyTorch implementation of "LayerMerge: Neural Network Depth Compression through Layer Pruning and Merging" (ICML'24)
diaoenmao/Pruning-Deep-Neural-Networks-from-a-Sparsity-Perspective
[ICLR 2023] Pruning Deep Neural Networks from a Sparsity Perspective
IlanPrice/DCTpS
Code for testing DCT plus Sparse (DCTpS) networks
TaehyeonKim-pyomu/CNN_compression_rank_selection_BayesOpt
Bayesian Optimization-Based Global Optimal Rank Selection for Compression of Convolutional Neural Networks, IEEE Access
snu-mllab/Efficient-CNN-Depth-Compression
Official PyTorch implementation of "Efficient Latency-Aware CNN Depth Compression via Two-Stage Dynamic Programming" (ICML'23)
AnacletoLAB/sHAM
Compact representations of convolutional neural networks via weight pruning and quantization
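As a toy illustration of the two stages such compact formats build on, the sketch below prunes by magnitude and then uniformly quantizes the surviving weights to 2**bits levels; sHAM's actual contribution is the entropy-coded storage format on top, which is not shown, and the function is hypothetical.

```python
import torch

@torch.no_grad()
def prune_and_quantize(weight, sparsity=0.9, bits=4):
    """Zero the smallest-magnitude weights, then uniformly quantize
    the survivors; returns the dequantized tensor plus the mask."""
    flat = weight.abs().flatten()
    threshold = flat.sort().values[int(sparsity * flat.numel())]
    mask = weight.abs() > threshold
    survivors = weight[mask]
    lo, hi = survivors.min(), survivors.max()
    scale = (hi - lo).clamp_min(1e-8) / (2 ** bits - 1)
    out = torch.zeros_like(weight)
    out[mask] = torch.round((survivors - lo) / scale) * scale + lo
    return out, mask
```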
kkahatapitiya/LinearConv
Code for our WACV 2021 paper "Exploiting the Redundancy in Convolutional Filters for Parameter Reduction"
Krishnkant-Swarnkar/Pytorch-pruning
Implementations of various neural network pruning methods in PyTorch.
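For reference, PyTorch ships basic pruning utilities in `torch.nn.utils.prune` that implementations like this commonly wrap or extend; a minimal usage example (the model architecture is arbitrary):

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# L1-unstructured pruning: zero the 50% smallest-magnitude weights per layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)

# Fold the masks into the weight tensors permanently.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.remove(module, "weight")
```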
espn-neurips2020/ESPN
ESPN: Extreme Sparse Pruned Network
ChanganVR/learning-to-learn-sparsity
Use a meta-network to learn the importance and correlation of neural network weights
coinslab/StatPruneNet
Development and Evaluation of Neural Net Sensitivity-Based Pruning Algorithms Using Statistical Inference
d-becking/nncodec-icml-2023-demo
This repository is for reproducing the results shown in the NNCodec ICML Workshop paper. Additionally, it includes a demo, prepared for the Neural Compression Workshop (NCW).
daniel-rychlewski/cnn-planesnet
Compressed CNNs for airplane classification in satellite images (APoZ-based parameter pruning, INT8 weight quantization)
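APoZ (Average Percentage of Zeros) ranks channels by how often they output zeros after ReLU; channels that almost never fire are pruned first. A hedged sketch of the scoring pass (hook-based; the names are invented here):

```python
import torch

@torch.no_grad()
def apoz_scores(model, conv_layer, loader, device="cpu"):
    """Per-output-channel fraction of zero activations, averaged over
    a dataset; high APoZ marks a channel as a prune candidate."""
    totals, batches = None, 0

    def hook(module, inputs, output):
        nonlocal totals, batches
        act = torch.relu(output)                     # post-ReLU activations
        z = (act == 0).float().mean(dim=(0, 2, 3))   # zero rate per channel
        totals = z if totals is None else totals + z
        batches += 1

    handle = conv_layer.register_forward_hook(hook)
    for x, _ in loader:
        model(x.to(device))
    handle.remove()
    return totals / batches
```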
datur/nncompression
Master's thesis project: https://davidturner94.github.com/nncompression
jaketae/nn-svd
Neural network compression with SVD
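The idea: factor a weight matrix with a truncated SVD, W ≈ U_r Σ_r V_rᵀ, so an m×n layer becomes two layers with rank·(m+n) parameters. A minimal sketch for `nn.Linear` (the helper name is illustrative; the repo may structure this differently):

```python
import torch
import torch.nn as nn

@torch.no_grad()
def svd_compress(linear, rank):
    """Replace one Linear layer with two low-rank ones."""
    U, S, Vh = torch.linalg.svd(linear.weight, full_matrices=False)
    first = nn.Linear(linear.in_features, rank, bias=False)
    second = nn.Linear(rank, linear.out_features,
                       bias=linear.bias is not None)
    first.weight.copy_(torch.diag(S[:rank]) @ Vh[:rank])  # (rank, in)
    second.weight.copy_(U[:, :rank])                      # (out, rank)
    if linear.bias is not None:
        second.bias.copy_(linear.bias)
    return nn.Sequential(first, second)
```

The factorization only saves parameters when rank < m·n / (m + n).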
MichiganCOG/MINT
Neural Network Pruning Using Dependency Measures
wesamnabeel99/neural-network-compression
Image classification using a compressed deep neural network, ported to resource-constrained platforms.