ddpo-pytorch

DDPO (Denoising Diffusion Policy Optimization) for finetuning diffusion models, implemented in PyTorch with LoRA support.
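As a rough illustration (not this repository's actual code), DDPO treats the denoising chain as a multi-step MDP and applies a PPO-style clipped surrogate objective to the per-step denoising log-probabilities, with rewards scored on the final image. A minimal NumPy sketch of that clipped update, using hypothetical names and shapes:

```python
import numpy as np

def ddpo_clipped_loss(logp_new, logp_old, advantages, clip=1e-4):
    """PPO-style clipped surrogate over denoising timesteps (illustrative).

    logp_new / logp_old: log-probs of the sampled denoising actions under the
    current and behavior policies, shape (batch, timesteps).
    advantages: per-sample normalized reward advantages, shape (batch,).
    """
    ratio = np.exp(logp_new - logp_old)             # per-step importance weights
    adv = advantages[:, None]                       # broadcast over timesteps
    unclipped = ratio * adv
    clipped = np.clip(ratio, 1.0 - clip, 1.0 + clip) * adv
    # Maximize the surrogate objective, so minimize its negation.
    return -np.mean(np.minimum(unclipped, clipped))
```

In the actual repo the log-probabilities come from the diffusion sampler's per-step Gaussian transitions, and only the LoRA parameters receive gradients; the function above just shows the shape of the surrogate loss.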

Primary language: Python. License: MIT.
