MinkaiXu / fPO (View on GitHub)
f-PO: Generalizing Preference Optimization with f-divergence Minimization
13 stars · Apr 2, 2025 · Updated 11 months ago
