DPO

Direct Preference Optimization Implementation
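As a quick orientation to what such a repository implements, the core of DPO is a loss on preference pairs built from policy/reference log-probability ratios. Below is a minimal, framework-free sketch of that loss for a single pair; the function and argument names are illustrative assumptions, not this repository's actual API.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Sketch of the DPO objective for one preference pair.

    Arguments (hypothetical names): summed log-probabilities of the
    chosen and rejected responses under the trained policy and under
    the frozen reference model. beta scales the implicit KL penalty.
    """
    chosen_logratio = policy_chosen_logp - ref_chosen_logp
    rejected_logratio = policy_rejected_logp - ref_rejected_logp
    logits = beta * (chosen_logratio - rejected_logratio)
    # -log(sigmoid(x)) computed stably as log(1 + exp(-x))
    return math.log1p(math.exp(-logits))
```

When the policy matches the reference model exactly, both log-ratios vanish and the loss equals log(2); it decreases as the policy assigns relatively more probability to the chosen response than the rejected one.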

Primary language: Python. License: Apache-2.0.
