Guided sampling and knowledge distillation questions!
Luke2642 opened this issue · 1 comment
Luke2642 commented
This is great work!
Is there any theoretical basis for how a diffusion model could be adapted or distilled into a PFGM model? If I understand correctly, the diffusion model has already captured the data distribution, so could this theoretically be more efficient than training a PFGM from data? I'm particularly interested in DDIM inversion and null-text inversion in diffusion models. PFGM++ seems to excel at inversion!
EDIT - I had various other questions, but on rereading the paper I think you covered them in the one-sample-per-condition part.
Luke2642 commented
https://arxiv.org/pdf/2303.01469.pdf
To answer my own question: yes. This paper (Consistency Models) distills a pre-trained diffusion model by using it to solve the probability-flow ODE.
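For intuition on "using a pre-trained model to solve the ODE", here is a toy sketch (not from the paper) of probability-flow ODE sampling in the EDM parameterization, where a trained score model is replaced by the exact analytic score of a 1-D Gaussian so the whole thing is self-contained. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
s_data = 1.0                     # std of the toy data distribution N(0, s_data^2)
sigma_max, sigma_min = 80.0, 1e-3
n_steps = 400

def score(x, sigma):
    # Exact score of the noised marginal N(0, s_data^2 + sigma^2).
    # A trained diffusion model approximates this function; distillation
    # methods query exactly this kind of pre-trained score along the ODE.
    return -x / (s_data**2 + sigma**2)

# Noise schedule: geometric from sigma_max down to sigma_min, then to 0.
sigmas = np.concatenate([np.geomspace(sigma_max, sigma_min, n_steps), [0.0]])

# Start from the terminal marginal and integrate the probability-flow ODE
#   dx/dsigma = -sigma * score(x, sigma)
# backwards in sigma with Euler steps.
x = rng.normal(0.0, np.sqrt(s_data**2 + sigma_max**2), size=20000)
for s_cur, s_next in zip(sigmas[:-1], sigmas[1:]):
    dx = -s_cur * score(x, s_cur)
    x = x + dx * (s_next - s_cur)

print(round(x.std(), 2))  # should land close to s_data
```

Since the ODE here is linear, the exact solution contracts each sample by s_data / sqrt(s_data**2 + sigma_max**2), so the final sample std recovers s_data; a distilled model would learn to perform this whole trajectory in one (or a few) network calls.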