nzjin / awesome_moe
A collection of MoE (Mixture of Experts) papers, code, tools, etc.
☆12 · Updated last year
Alternatives and similar repositories for awesome_moe
Users interested in awesome_moe are comparing it to the libraries listed below:
- [NeurIPS 2024] A Novel Rank-Based Metric for Evaluating Large Language Models ☆54 · Updated 5 months ago
- Official implementation for EMNLP 2024 (main) "AgentReview: Exploring Academic Peer Review with LLM Agent." ☆90 · Updated 11 months ago
- Parameter-Efficient Fine-Tuning for Foundation Models ☆98 · Updated 7 months ago
- MoCLE (first MLLM with MoE for instruction customization and generalization!) (https://arxiv.org/abs/2312.12379) ☆44 · Updated 4 months ago
- [NAACL 2025] The official implementation of the paper "Learning From Failure: Integrating Negative Examples when Fine-tuning Large Language M…" ☆29 · Updated last year
- Code for ACL 2024 "MELoRA: Mini-Ensemble Low-Rank Adapter for Parameter-Efficient Fine-Tuning" ☆32 · Updated 8 months ago
- ☆163 · Updated last year
- [ACL'25] We propose a novel fine-tuning method, Separate Memory and Reasoning, which combines prompt tuning with LoRA. ☆79 · Updated last week
- ☆28 · Updated 5 months ago
- [ACL 2024] The official codebase for the paper "Self-Distillation Bridges Distribution Gap in Language Model Fine-tuning". ☆135 · Updated last year
- ☆25 · Updated last year
- ☆130 · Updated 7 months ago
- Official code for paper "SPA-RL: Reinforcing LLM Agent via Stepwise Progress Attribution"