direct-preference-optimization

Reference implementation of Direct Preference Optimization (DPO).
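DPO fine-tunes a language model directly on preference pairs, without a separate reward model: for a prompt with a chosen and a rejected response, it pushes the policy's log-probability ratio for the chosen response above the rejected one, relative to a frozen reference model. A minimal sketch of that loss is below; the function and parameter names are illustrative assumptions, not this repository's actual API.

```python
# Minimal sketch of the DPO objective (illustrative, not this repo's code).
# Inputs are summed per-sequence log-probabilities of the chosen/rejected
# responses under the trained policy and a frozen reference model.
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps: torch.Tensor,
             policy_rejected_logps: torch.Tensor,
             ref_chosen_logps: torch.Tensor,
             ref_rejected_logps: torch.Tensor,
             beta: float = 0.1) -> torch.Tensor:
    """-log sigmoid(beta * (policy log-ratio - reference log-ratio))."""
    policy_logratio = policy_chosen_logps - policy_rejected_logps
    ref_logratio = ref_chosen_logps - ref_rejected_logps
    # beta controls how strongly the policy is kept near the reference model.
    losses = -F.logsigmoid(beta * (policy_logratio - ref_logratio))
    return losses.mean()
```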

Primary language: Python · License: Apache-2.0
