/direct-preference-optimization

Reference implementation for DPO (Direct Preference Optimization)
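
For orientation, below is a minimal sketch of the DPO objective the paper derives, computed from per-sequence log-probabilities under the trained policy and a frozen reference model. The function and argument names (`dpo_loss`, `policy_chosen_logps`, etc.) and the `beta=0.1` default are illustrative assumptions, not the repository's actual API:

```python
import torch
import torch.nn.functional as F

def dpo_loss(
    policy_chosen_logps: torch.Tensor,    # (batch,) log pi_theta(y_w | x)
    policy_rejected_logps: torch.Tensor,  # (batch,) log pi_theta(y_l | x)
    ref_chosen_logps: torch.Tensor,       # (batch,) log pi_ref(y_w | x)
    ref_rejected_logps: torch.Tensor,     # (batch,) log pi_ref(y_l | x)
    beta: float = 0.1,                    # KL-penalty strength (illustrative default)
) -> torch.Tensor:
    """DPO loss: -log sigmoid(beta * margin of policy/reference log-ratios)."""
    # Per-example log-ratios of the policy against the frozen reference
    chosen_logratios = policy_chosen_logps - ref_chosen_logps
    rejected_logratios = policy_rejected_logps - ref_rejected_logps
    # Preferred response should have the larger log-ratio
    logits = beta * (chosen_logratios - rejected_logratios)
    return -F.logsigmoid(logits).mean()
```

Since the reference model enters only through its frozen log-probabilities, those values can be precomputed once per dataset rather than re-evaluated every step.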

Primary Language: Jupyter Notebook · License: Apache-2.0
