distillation
There are 172 public repositories under the distillation topic.
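Many of the repositories below implement variants of Hinton-style knowledge distillation, where a small student network is trained to match a large teacher's temperature-softened output distribution. As background, here is a minimal sketch of that softened-softmax KL loss in plain Python (not taken from any of the listed projects; function names are illustrative):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between the softened teacher and student distributions,
    scaled by T^2 to keep gradient magnitudes comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # soft targets from the teacher
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# A student that matches the teacher exactly incurs zero loss.
teacher = [2.0, 1.0, 0.1]
print(distillation_loss(teacher, teacher))  # → 0.0
```

In practice this distillation term is combined with the ordinary cross-entropy loss on hard labels, with a weighting factor between the two.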
IntelLabs/distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
FLHonker/Awesome-Knowledge-Distillation
Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category.
AberHu/Knowledge-Distillation-Zoo
Pytorch implementation of various Knowledge Distillation (KD) methods.
airaria/TextBrewer
A PyTorch-based knowledge distillation toolkit for natural language processing
GMvandeVen/continual-learning
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
PaddlePaddle/PaddleSlim
PaddleSlim is an open-source library for deep model compression and architecture search.
ViTAE-Transformer/ViTPose
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"
tangxyw/RecSysPapers
A collection of industry-classic and cutting-edge papers in the fields of recommendation, advertising, and search.
Syencil/mobile-yolov5-pruning-distillation
Pruning and distillation for MobileNetV2-YOLOv5s, with ncnn and TensorRT deployment support. Ultra-light but better performance!
CLUEbenchmark/CLUEPretrainedModels
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and models specialized for similarity.
szq0214/MEAL-V2
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop.
Tanuki/tanuki.py
Prompt engineering for developers
segmind/distill-sd
Segmind Distilled diffusion
thu-ml/ares
A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.
anarchy-ai/LLM-VM
irresponsible innovation. Try now at https://chat.dev/
gojasper/flash-diffusion
Official implementation of ⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation
Zhen-Dong/HAWQ
Quantization library for PyTorch. Support low-precision and mixed-precision quantization, with hardware implementation through TVM.
huggingface/optimum-intel
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
dotchen/LAV
(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.
qiangsiwei/bert_distill
BERT distillation (distillation experiments based on BERT)
gyunggyung/AGI-Papers
Papers and Book to look at when starting AGI 📚
Nota-NetsPresso/BK-SDM
A Compressed Stable Diffusion for Efficient Text-to-Image Generation [ECCV'24]
leondgarse/Keras_insightface
Insightface Keras implementation
GMvandeVen/brain-inspired-replay
A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
sinanuozdemir/quick-start-guide-to-llms
The Official Repo for "Quick Start Guide to Large Language Models"
HoyTta0/KnowledgeDistillation
Knowledge distillation for text classification with PyTorch. Chinese text classification; teacher models BERT and XLNet, student model BiLSTM.
Sharpiless/Yolov5-distillation-train-inference
YOLOv5 knowledge distillation training, with support for training on your own data.
snap-research/R2L
[ECCV 2022] R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis
monologg/DistilKoBERT
Distillation of KoBERT from SKTBrain (Lightweight KoBERT)
szq0214/FKD
Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
BioSTEAMDevelopmentGroup/biosteam
The Biorefinery Simulation and Techno-Economic Analysis Modules; Life Cycle Assessment; Chemical Process Simulation Under Uncertainty
dotchen/WorldOnRails
(ICCV 2021, Oral) RL and distillation in CARLA using a factorized world model
Efficient-ML/Awesome-Efficient-LLM-Diffusion
A list of papers, docs, and code about efficient AIGC. This repo aims to provide information for efficient AIGC research, covering both language and vision, and is continuously updated. PRs adding works (papers, repositories) the repo has missed are welcome.
fxmeng/filter-grafting
Filter Grafting for Deep Neural Networks (CVPR 2020)
as791/Adversarial-Example-Attack-and-Defense
Implementations of three adversarial-example attack methods (FGSM, I-FGSM, MI-FGSM) and defensive distillation as a defense against all three, on the MNIST dataset.
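The attacks in the entry above all build on the Fast Gradient Sign Method: perturb each input feature a fixed step epsilon in the sign of the loss gradient with respect to the input. A minimal sketch of that update step in plain Python (the gradient values here are illustrative; in the repository they would come from backpropagation through the model):

```python
def sign(g):
    """Elementwise sign of a scalar: +1, -1, or 0."""
    return (g > 0) - (g < 0)

def fgsm_perturb(x, grad, epsilon=0.25):
    """FGSM: step each input feature by epsilon in the direction
    (sign of the loss gradient) that increases the loss."""
    return [xi + epsilon * sign(gi) for xi, gi in zip(x, grad)]

# Toy example: a 3-pixel input and a hand-picked loss gradient.
x = [0.5, 0.5, 0.5]
grad = [1.0, -2.0, 0.0]
print(fgsm_perturb(x, grad, epsilon=0.25))  # → [0.75, 0.25, 0.5]
```

I-FGSM applies this step iteratively with a smaller epsilon, and MI-FGSM adds a momentum term to the accumulated gradient; defensive distillation instead retrains the model on temperature-softened outputs to flatten these gradients.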