XueFuzhao / OpenMoE — View on GitHub
A family of open-sourced Mixture-of-Experts (MoE) Large Language Models
1,663 stars · Mar 8, 2024 · Updated last year

Alternatives and similar repositories for OpenMoE

Users interested in OpenMoE are comparing it to the libraries listed below.
