XueFuzhao / OpenMoE

A family of open-source Mixture-of-Experts (MoE) Large Language Models.

1,391 stars · Updated 8 months ago
