ddpo-pytorch

DDPO (Denoising Diffusion Policy Optimization) for finetuning diffusion models, implemented in PyTorch with LoRA support.
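DDPO treats the denoising chain as a policy and optimizes it with a PPO-style clipped surrogate loss, where each denoising step contributes a log-probability and the reward on the final image supplies the advantage. A minimal dependency-free sketch of that clipped objective, with illustrative names not taken from this repository:

```python
import math

def ddpo_clipped_loss(logp_new, logp_old, advantages, clip_range=1e-4):
    """Sketch of a PPO-style clipped surrogate over per-denoising-step
    log-probs. `logp_new`/`logp_old` are log-probabilities of the same
    sampled steps under the current and data-collection policies;
    `advantages` come from the image reward. Names are illustrative."""
    total = 0.0
    for ln, lo, adv in zip(logp_new, logp_old, advantages):
        ratio = math.exp(ln - lo)          # importance-sampling ratio
        unclipped = -adv * ratio           # plain policy-gradient term
        clipped = -adv * max(min(ratio, 1.0 + clip_range), 1.0 - clip_range)
        total += max(unclipped, clipped)   # pessimistic (clipped) choice
    return total / len(advantages)
```

When the policies coincide the ratio is 1 and the loss reduces to the mean negative advantage; the clip range bounds how far a single update can move the policy.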

Primary language: Python · License: MIT
