uclaml / MoE

Towards Understanding the Mixture-of-Experts Layer in Deep Learning
22 stars · Updated last year
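
For orientation, here is a minimal sketch of the kind of Mixture-of-Experts layer the paper studies: a gating network scores the experts, and the layer output is the gate-weighted combination of the expert outputs. This is an illustrative dense-routing simplification, not the repository's actual code; the class and parameter names (`MoELayer`, `dim_in`, `num_experts`) are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Dense-routing MoE sketch: output = sum_k gate_k(x) * expert_k(x)."""

    def __init__(self, dim_in: int, dim_out: int, num_experts: int = 4):
        super().__init__()
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim_in, dim_out), nn.ReLU())
            for _ in range(num_experts)
        )
        # The gate produces one score per expert for each input example.
        self.gate = nn.Linear(dim_in, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.gate(x), dim=-1)                    # (batch, num_experts)
        outputs = torch.stack([e(x) for e in self.experts], dim=1)  # (batch, num_experts, dim_out)
        return (weights.unsqueeze(-1) * outputs).sum(dim=1)         # gate-weighted mixture

x = torch.randn(8, 16)
print(MoELayer(16, 32)(x).shape)  # torch.Size([8, 32])
```

Sparse variants (e.g. top-1 or top-k routing, as analyzed in the paper) would route each input to only the highest-scoring experts instead of mixing all of them.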
