davidmrau / mixture-of-experts

PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. (https://arxiv.org/abs/1701.06538)
1,018 stars · Updated 8 months ago
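The paper's core idea is sparse top-k gating: a learned gating network scores all experts, but only the k highest-scoring experts are activated per example, with the softmax renormalized over just those k. Below is a minimal NumPy sketch of that gating step (the noise term and load-balancing loss from the paper are omitted, and the function and variable names are illustrative, not this repository's API):

```python
import numpy as np

def top_k_gating(x, w_gate, k=2):
    """Simplified sparse top-k gating: keep only the k largest
    gate logits per example, softmax over those, zero the rest."""
    logits = x @ w_gate                           # (batch, num_experts)
    top_k_idx = np.argsort(logits, axis=-1)[:, -k:]
    mask = np.zeros_like(logits, dtype=bool)
    np.put_along_axis(mask, top_k_idx, True, axis=-1)
    masked = np.where(mask, logits, -np.inf)      # suppress non-top-k experts
    exp = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)  # k nonzeros per row, summing to 1

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))                   # batch of 4, input dim 8
w_gate = rng.standard_normal((8, 6))              # gating weights for 6 experts
gates = top_k_gating(x, w_gate, k=2)
```

Each row of `gates` then weights the outputs of its two selected experts; all other experts are skipped entirely, which is what makes the layer's compute sparse.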
