Papers and projects on knowledge distillation for diffusion models
- A Comprehensive Survey on Knowledge Distillation of Diffusion Models. 2023. Weijian Luo. [pdf]
- Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed. 2021. Eric Luhman, Troy Luhman. [pdf]
- Progressive Distillation for Fast Sampling of Diffusion Models. ICLR 2022. Tim Salimans, Jonathan Ho. [pdf]
- On Distillation of Guided Diffusion Models. CVPR 2023. Chenlin Meng, Robin Rombach, Ruiqi Gao, Diederik P. Kingma, Stefano Ermon, Jonathan Ho, Tim Salimans. [pdf]
- TRACT: Denoising Diffusion Models with Transitive Closure Time-Distillation. 2023. David Berthelot, Arnaud Autef, Jierui Lin, Dian Ang Yap, Shuangfei Zhai, Siyuan Hu, Daniel Zheng, Walter Talbott, Eric Gu. [pdf]
- BK-SDM: Architecturally Compressed Stable Diffusion for Efficient Text-to-Image Generation. ICML 2023. Bo-Kyeong Kim, Hyoung-Kyu Song, Thibault Castells, Shinkook Choi. [pdf]
- On Architectural Compression of Text-to-Image Diffusion Models. 2023. Bo-Kyeong Kim, Hyoung-Kyu Song, Thibault Castells, Shinkook Choi. [pdf]
- Knowledge Diffusion for Distillation. 2023. Tao Huang, Yuan Zhang, Mingkai Zheng, Shan You, Fei Wang, Chen Qian, Chang Xu. [pdf]
- SnapFusion: Text-to-Image Diffusion Model on Mobile Devices within Two Seconds. 2023. Yanyu Li, Huan Wang, Qing Jin, Ju Hu, Pavlo Chemerys, Yun Fu, Yanzhi Wang, Sergey Tulyakov, Jian Ren. [pdf]
- BOOT: Data-free Distillation of Denoising Diffusion Models with Bootstrapping. 2023. Jiatao Gu, Shuangfei Zhai, Yizhe Zhang, Lingjie Liu, Josh Susskind. [pdf]
- Consistency Models. ICML 2023. Yang Song, Prafulla Dhariwal, Mark Chen, Ilya Sutskever. [pdf]