sahithyaravi/direct-preference-optimization
Reference implementation for DPO (Direct Preference Optimization)
Language: Python · License: Apache-2.0
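For orientation, below is a minimal sketch of the DPO objective that a reference implementation would center on. This is the standard loss from the DPO paper written independently here, not code taken from this repository; the function name, argument names, and default `beta` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps: torch.Tensor,
             policy_rejected_logps: torch.Tensor,
             ref_chosen_logps: torch.Tensor,
             ref_rejected_logps: torch.Tensor,
             beta: float = 0.1) -> torch.Tensor:
    """Sketch of the DPO loss (names/signature are illustrative, not the repo's API).

    Each argument is a batch of per-example sequence log-probabilities:
    log pi(y|x) summed over response tokens, for the trainable policy and
    for a frozen reference model, on the preferred (chosen) and
    dispreferred (rejected) responses.
    """
    # Log-ratio of policy to reference on each response.
    chosen_logratios = policy_chosen_logps - ref_chosen_logps
    rejected_logratios = policy_rejected_logps - ref_rejected_logps

    # DPO pushes the chosen log-ratio above the rejected one,
    # scaled by the temperature beta.
    logits = beta * (chosen_logratios - rejected_logratios)

    # -log sigmoid(logits), averaged over the batch; logsigmoid is
    # the numerically stable form.
    return -F.logsigmoid(logits).mean()
```

In practice the per-sequence log-probabilities are obtained by summing token log-probs of the response under each model, and only the policy receives gradients; the reference model stays frozen.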