DPO-IPO

A reference implementation of DPO (Direct Preference Optimization) and IPO (Identity Preference Optimization).

Primary language: Python · License: Apache-2.0

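For orientation, below is a minimal sketch of the two objectives this repository targets, not the repository's actual code: the function name, signature, and `loss_type` switch are illustrative assumptions, and the sketch assumes PyTorch with per-sequence summed log-probabilities already computed for the chosen and rejected responses.

```python
# Hypothetical sketch of the DPO and IPO pairwise preference losses.
import torch
import torch.nn.functional as F


def preference_loss(
    policy_chosen_logps: torch.Tensor,    # log pi(y_w | x), shape (batch,)
    policy_rejected_logps: torch.Tensor,  # log pi(y_l | x), shape (batch,)
    ref_chosen_logps: torch.Tensor,       # log pi_ref(y_w | x), shape (batch,)
    ref_rejected_logps: torch.Tensor,     # log pi_ref(y_l | x), shape (batch,)
    beta: float = 0.1,
    loss_type: str = "dpo",               # "dpo" or "ipo" (illustrative flag)
) -> torch.Tensor:
    # Margin of the policy's log-ratio over the frozen reference model's log-ratio.
    pi_logratios = policy_chosen_logps - policy_rejected_logps
    ref_logratios = ref_chosen_logps - ref_rejected_logps
    logits = pi_logratios - ref_logratios

    if loss_type == "dpo":
        # DPO: negative log-sigmoid of the scaled margin (Rafailov et al., 2023).
        losses = -F.logsigmoid(beta * logits)
    elif loss_type == "ipo":
        # IPO: squared regression of the margin toward 1/(2*beta) (Azar et al., 2023).
        losses = (logits - 1.0 / (2.0 * beta)) ** 2
    else:
        raise ValueError(f"unknown loss_type: {loss_type}")
    return losses.mean()
```

The two losses share the same inputs and differ only in how the log-ratio margin is penalized, which is why a single function with a loss-type switch is a common way to express them.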