XueFuzhao / OpenMoE
A family of open-sourced Mixture-of-Experts (MoE) Large Language Models
1,667 stars · Updated Mar 8, 2024
