wrmedford / moe-scaling
Scaling Laws for Mixture of Experts Models
15 stars · Feb 25, 2025 · Updated last year

Alternatives and similar repositories for moe-scaling

Users interested in moe-scaling are comparing it to the libraries listed below.

