okarthikb / DPO
Implementation of Direct Preference Optimization
17 stars · Jul 17, 2023 · Updated 2 years ago
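The repository above implements Direct Preference Optimization (DPO), which fine-tunes a policy directly on preference pairs instead of training a separate reward model. As a minimal sketch of the core idea (not the repo's actual code; the function name, arguments, and scalar formulation here are illustrative), the DPO loss for one preference pair compares how much the policy raises the log-probability of the chosen response over the rejected one, relative to a frozen reference model:

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for a single preference pair (illustrative sketch).

    Each argument is the summed token log-probability of a full response
    under the trainable policy or the frozen reference model. `beta`
    scales the implicit KL penalty: larger values keep the policy
    closer to the reference.
    """
    chosen_logratio = policy_chosen_logp - ref_chosen_logp
    rejected_logratio = policy_rejected_logp - ref_rejected_logp
    margin = beta * (chosen_logratio - rejected_logratio)
    # Loss is -log(sigmoid(margin)); softplus(-margin) is the same
    # quantity written in a numerically safer form.
    if margin < -30:
        return -margin  # avoid overflow in exp() for very negative margins
    return math.log1p(math.exp(-margin))

# Policy prefers the chosen response more than the reference does,
# so the loss is small and positive.
loss = dpo_loss(-12.0, -15.0, -12.5, -14.0, beta=0.1)
```

In a real training loop these log-probabilities come from batched forward passes over tokenized sequences, and the loss is averaged over the batch before backpropagation.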

Alternatives and similar repositories for DPO

Users interested in DPO are comparing it to the libraries listed below.

