ddpo-pytorch

DDPO for fine-tuning diffusion models, implemented in PyTorch with LoRA support.
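DDPO (Denoising Diffusion Policy Optimization) treats each denoising step of a diffusion sampler as an action and applies a PPO-style clipped policy-gradient update weighted by the reward of the final image. As a rough illustration of that objective only, here is a minimal sketch in plain PyTorch with toy tensors; the shapes, clip range, and variable names are assumptions for illustration, not this repository's API.

```python
import torch

# Toy DDPO-style update: per-step Gaussian log-probs stand in for the
# denoising policy; rewards are per-sample and normalized to advantages.
# All shapes and hyperparameters here are illustrative assumptions.
torch.manual_seed(0)
batch, steps = 4, 10

# Log-probs of each denoising step under the sampling-time ("old") policy
# and the current policy being optimized.
old_logp = torch.randn(batch, steps)
new_logp = (old_logp + 0.01 * torch.randn(batch, steps)).requires_grad_(True)

# Per-sample rewards (e.g. from an aesthetic or prompt-alignment scorer),
# normalized to zero-mean advantages.
rewards = torch.randn(batch)
adv = (rewards - rewards.mean()) / (rewards.std() + 1e-8)

ratio = torch.exp(new_logp - old_logp)           # importance weight per step
clipped = torch.clamp(ratio, 1 - 0.2, 1 + 0.2)   # PPO clip range, eps = 0.2
loss = -torch.min(ratio * adv[:, None], clipped * adv[:, None]).mean()
loss.backward()
```

In the real method, `new_logp` would come from the Gaussian transition density of the (LoRA-adapted) UNet's denoising step, and the update would be applied over minibatches of sampled trajectories.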

Primary language: Python. License: MIT.
