model-distillation
There are 16 repositories under the model-distillation topic.
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
autodistill/autodistill
Images to inference with no labeling (use foundation models to train supervised models).
Hramchenko/diffusion_distiller
🚀 PyTorch implementation of "Progressive Distillation for Fast Sampling of Diffusion Models" (v-diffusion)
pauljblazek/deepdistilling
Mechanistically interpretable neurosymbolic AI (Nature Comput Sci 2024): losslessly compresses NNs into computer code, discovering new algorithms that generalize out-of-distribution and outperform human-designed algorithms.
kaiyuyue/mgd
Matching Guided Distillation (ECCV 2020)
bloomberg/minilmv2.bb
Our open source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188)
frankaging/Causal-Distill
The Codebase for Causal Distillation for Language Models (NAACL '22)
dataplayer12/darklight
A framework for knowledge distillation using TensorRT inference on the teacher network
jzuern/autograph-pub
Repository for the publication "AutoGraph: Predicting Lane Graphs from Traffic"
frankaging/Causal-Distill-XXS
The Codebase for Causal Distillation for Task-Specific Models
chadHGY/awesome-deep-model-compression
Awesome Deep Model Compression
KonstantinosBarmpas/Shallow-Waters
[Master Thesis] Research project at the Data Analytics Lab in collaboration with Daedalean AI. The thesis was submitted to both ETH Zürich and Imperial College London.
autodistill/autodistill-gcp-vision
Autodistill Google Cloud Vision module for use in training a custom, fine-tuned model.
autodistill/autodistill-rekognition
Use AWS Rekognition to train custom models that you own.
capjamesg/autodistill-llama
Use LLaMA to label data for use in training a fine-tuned LLM.
dogeplusplus/seafood-distillation
Model distillation of CNNs for classifying seafood images in PyTorch
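Several repositories above implement variants of classic knowledge distillation, where a student network is trained to match a teacher's temperature-softened output distribution. A minimal standard-library sketch of that soft-target loss (following Hinton et al., 2015) is below; the function names and example logits are illustrative, not taken from any of the listed repositories:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between the teacher's and student's softened
    distributions, scaled by T^2 so gradients keep the same magnitude
    as the hard-label loss when T varies."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    return (temperature ** 2) * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0
    )

# A student that matches the teacher exactly incurs zero loss;
# any mismatch produces a positive penalty.
matched = distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0])
mismatched = distillation_loss([0.0, 0.0, 0.0], [2.0, 0.5, -1.0])
```

In practice this term is combined with an ordinary cross-entropy loss on the true labels, with a weighting hyperparameter balancing the two.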