Survey: A collection of AWESOME papers and resources on the latest research in Mixture of Experts.
☆140 · Aug 21, 2024 · Updated last year
Alternatives and similar repositories for Awesome-Mixture-of-Experts-Papers
Users interested in Awesome-Mixture-of-Experts-Papers are comparing it to the repositories listed below.
- [TKDE'25] The official GitHub page for the survey paper "A Survey on Mixture of Experts in Large Language Models". ☆482 · Jul 23, 2025 · Updated 7 months ago
- A curated reading list of research in Mixture-of-Experts (MoE). ☆661 · Oct 30, 2024 · Updated last year
- Code for the preprint "Cache Me If You Can: How Many KVs Do You Need for Effective Long-Context LMs?" ☆48 · Jul 29, 2025 · Updated 7 months ago