SuperBruceJia / Awesome-Mixture-of-Experts
Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)
☆37 · Updated last month
Alternatives and similar repositories for Awesome-Mixture-of-Experts
Users interested in Awesome-Mixture-of-Experts are comparing it to the repositories listed below.
- ☆120 · Updated 5 months ago
- ☆113 · Updated last year
- MoCLE: the first MLLM with MoE for instruction customization and generalization (https://arxiv.org/abs/2312.12379) ☆43 · Updated 2 months ago
- State-of-the-art Parameter-Efficient MoE Fine-tuning Method ☆180 · Updated last year
- [NeurIPS'24 Oral] HydraLoRA: An Asymmetric LoRA Architecture for Efficient Fine-Tuning ☆220 · Updated 8 months ago
- Survey: A collection of AWESOME papers and resources on the latest research in Mixture of Experts.
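For readers new to the topic these lists curate, the sketch below shows the core idea behind a sparsely gated Mixture-of-Experts layer: a router scores each token, the top-k experts are selected, and their outputs are combined with the softmax-normalized router weights. This is a minimal illustrative example; the class and parameter names (`TopKMoE`, `num_experts`, `top_k`) are assumptions for illustration and are not taken from any repository listed above.

```python
# Minimal sketch of top-k gated Mixture-of-Experts routing (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMoE(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router: scores each token against every expert.
        self.gate = nn.Linear(d_model, num_experts)
        # Experts: independent feed-forward networks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        logits = self.gate(x)                                # (batch, seq, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)   # keep the k best experts per token
        weights = F.softmax(weights, dim=-1)                 # renormalize over the selected experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., slot] == e               # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out


if __name__ == "__main__":
    layer = TopKMoE(d_model=32, d_hidden=64)
    tokens = torch.randn(2, 5, 32)
    print(layer(tokens).shape)  # torch.Size([2, 5, 32])
```

The listed repositories cover many refinements of this basic pattern, such as load-balancing losses, parameter-efficient expert fine-tuning, and multimodal experts.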