ddpo-pytorch

DDPO (Denoising Diffusion Policy Optimization) for fine-tuning diffusion models, implemented in PyTorch with LoRA support.
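The core idea behind DDPO can be sketched as treating each denoising step as an action in an MDP and applying a policy-gradient update scaled by a reward on the final sample. The toy sketch below is illustrative only and assumes nothing about this repo's actual API: `ToyDenoiser`, `reward_fn`, and all hyperparameters are hypothetical stand-ins, and a plain REINFORCE update is shown in place of the clipped importance-sampling objective DDPO uses in practice.

```python
import torch

# Toy sketch of the DDPO idea (hypothetical names, not this repo's API):
# sample a denoising trajectory, accumulate its log-probability, score the
# final sample with a reward, and take a REINFORCE-style gradient step.

torch.manual_seed(0)

class ToyDenoiser(torch.nn.Module):
    """Stand-in for a diffusion model's denoiser (e.g. a LoRA-wrapped UNet)."""
    def __init__(self, dim=4):
        super().__init__()
        self.net = torch.nn.Linear(dim, dim)

    def forward(self, x, t):
        # Predicted mean of p(x_{t-1} | x_t); t is ignored in this toy model.
        return self.net(x)

model = ToyDenoiser()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
sigma = 0.1  # fixed per-step noise scale for the toy sampler

def sample_trajectory(steps=5, dim=4):
    """Run the reverse process, returning the sample and its total log-prob."""
    x = torch.randn(dim)
    logp = torch.tensor(0.0)
    for t in range(steps, 0, -1):
        mean = model(x, t)
        dist = torch.distributions.Normal(mean, sigma)
        x = dist.sample()              # action: one denoising step
        logp = logp + dist.log_prob(x).sum()
    return x, logp

def reward_fn(x):
    # Toy stand-in for a real reward (aesthetic score, prompt alignment, ...).
    return -x.pow(2).sum()

x, logp = sample_trajectory()
# REINFORCE: maximize E[r * log p(trajectory)], so minimize the negation.
loss = -reward_fn(x).detach() * logp
opt.zero_grad()
loss.backward()
opt.step()
```

The gradient flows only through the log-probability terms (the sampled states are detached), which is what makes the update a policy gradient rather than backpropagation through the sampler itself.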

