davidmrau / mixture-of-experts
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
1,232 · Apr 19, 2024 · Updated last year
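The repository re-implements the sparsely-gated mixture-of-experts layer from Shazeer et al. (2017), whose core idea is a gate that routes each input to only the top-k of N experts. The sketch below illustrates that top-k gating step in plain NumPy; it is a simplified illustration of the idea (it omits the noise term and load-balancing loss from the paper) and is not the repository's actual PyTorch code. The function name `top_k_gating` is chosen here for illustration.

```python
import numpy as np

def top_k_gating(logits, k):
    """Sparse gating sketch: keep the top-k logits per row, softmax over
    only those, and zero out the rest, so each input is routed to k experts.
    (Simplified: the paper also adds tunable noise and a load-balancing loss.)"""
    logits = np.asarray(logits, dtype=float)
    gates = np.zeros_like(logits)
    for i, row in enumerate(logits):
        top = np.argsort(row)[-k:]           # indices of the k largest logits
        exp = np.exp(row[top] - row[top].max())
        gates[i, top] = exp / exp.sum()      # softmax over the kept experts only
    return gates

# Each token is routed to k=2 of 4 experts; each row's gate weights sum to 1.
g = top_k_gating([[1.0, 2.0, 3.0, 0.5],
                  [0.1, 0.2, 4.0, 3.5]], k=2)
```

Because the softmax is taken only over the surviving logits, exactly k entries per row are nonzero; the remaining experts receive zero weight and need not be evaluated at all, which is what makes the layer sparse.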

Alternatives and similar repositories for mixture-of-experts

Users interested in mixture-of-experts are comparing it to the libraries listed below.
