uclaml / MoE

Towards Understanding the Mixture-of-Experts Layer in Deep Learning
31 stars · Updated last year

Alternatives and similar repositories for MoE

Users interested in MoE are comparing it to the libraries listed below.
