LINs-lab / DynMoE
[ICLR 2025] Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models
☆121 · Updated 3 weeks ago
Alternatives and similar repositories for DynMoE
Users interested in DynMoE are comparing it to the libraries listed below.
- [arXiv 2025] Efficient Reasoning Models: A Survey · ☆247 · Updated 2 weeks ago
- CoT-Valve: Length-Compressible Chain-of-Thought Tuning · ☆81 · Updated 5 months ago
- 🚀 LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training