Pinned Repositories
diffusers
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX.
Efficient-Deep-Learning
Collection of recent methods on (deep) neural network compression and acceleration.
OKDDip
[AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers".
SemCKD
[AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration".
SimKD
[CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier".
SomePapers
This repository records some papers I have read or to read.
text2image-benchmark
Benchmark for generative image models
diff-sampler
An open-source toolbox for fast sampling of diffusion models. Official implementations of our papers published at ICML, CVPR, and NeurIPS.
Knowledge-Distillation-Paper
This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation).
DefangChen's Repositories