sssth / awesome-DPO
Papers related to Direct Preference Optimization (DPO)
19 stars · Jul 16, 2024 · Updated last year

Alternatives and similar repositories for awesome-DPO

Users interested in awesome-DPO are comparing it to the libraries listed below.

