XueFuzhao / OpenMoE

A family of open-source Mixture-of-Experts (MoE) Large Language Models
1,425 stars · Updated 10 months ago
