davidmrau / mixture-of-experts
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
1,243 stars · Updated Apr 19, 2024
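The paper the repo re-implements introduces noisy top-k gating: a gating network scores all experts, only the k highest-scoring experts run per input, and their outputs are mixed by the renormalized gate weights. Below is a minimal NumPy sketch of that top-k gating idea, not the repo's actual PyTorch API; all class and parameter names (`SparseMoE`, `w_gate`, `num_experts`) are illustrative.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class SparseMoE:
    """Illustrative sparsely-gated mixture-of-experts layer with
    top-k gating (in the spirit of Shazeer et al., 2017).
    Names are hypothetical, not taken from the repo."""

    def __init__(self, dim, num_experts, k=2, seed=0):
        rng = np.random.default_rng(seed)
        self.k = k
        # Each expert is a single linear map dim -> dim (real experts
        # are usually small feed-forward networks).
        self.experts = [rng.standard_normal((dim, dim)) / np.sqrt(dim)
                        for _ in range(num_experts)]
        self.w_gate = rng.standard_normal((dim, num_experts)) / np.sqrt(dim)

    def __call__(self, x):
        logits = x @ self.w_gate                         # (batch, num_experts)
        topk = np.argsort(logits, axis=-1)[:, -self.k:]  # top-k expert indices per row
        out = np.zeros_like(x)
        for i, row in enumerate(x):
            idx = topk[i]
            gates = softmax(logits[i, idx])  # renormalize over the selected experts
            for g, e in zip(gates, idx):
                out[i] += g * (row @ self.experts[e])    # weighted expert mixture
        return out

moe = SparseMoE(dim=8, num_experts=4, k=2)
y = moe(np.ones((3, 8)))
print(y.shape)  # (3, 8)
```

Because only k experts run per input, compute grows with k rather than with the total expert count, which is what makes very large expert pools affordable in the paper.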
