SkyworkAI / MoE-plus-plus

[ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts
186 stars · Updated 4 months ago
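The paper's title refers to "zero-computation experts": in addition to ordinary FFN experts, the router can send a token to an expert that requires little or no compute (e.g. one that outputs zeros or simply passes the token through). The sketch below is a hypothetical, simplified illustration of that idea in NumPy; the expert names, routing scheme (top-1), and all shapes are assumptions for illustration, not the repository's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def zero_expert(x):
    # "Zero" expert: discards the token's contribution, costing no compute.
    return np.zeros_like(x)

def copy_expert(x):
    # "Copy" (identity) expert: passes the token through unchanged.
    return x

def make_ffn_expert(w1, w2):
    # A standard two-layer ReLU FFN expert, the conventional compute-heavy case.
    def f(x):
        return np.maximum(x @ w1, 0.0) @ w2
    return f

d, h = 4, 8  # assumed toy hidden/FFN dimensions
experts = [
    zero_expert,
    copy_expert,
    make_ffn_expert(rng.normal(size=(d, h)), rng.normal(size=(h, d))),
]

def route(x, router_w):
    # Top-1 routing: each token goes to its highest-scoring expert.
    scores = x @ router_w            # (tokens, n_experts)
    choice = scores.argmax(axis=-1)  # chosen expert index per token
    out = np.empty_like(x)
    for i, expert in enumerate(experts):
        mask = choice == i
        if mask.any():
            out[mask] = expert(x[mask])
    return out, choice

x = rng.normal(size=(6, d))
y, choice = route(x, rng.normal(size=(d, len(experts))))
print(choice)  # per-token expert assignments
```

Tokens routed to the zero or copy expert skip the FFN matmuls entirely, which is the source of the acceleration the title describes.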

Alternatives and similar repositories for MoE-plus-plus:
