TUDB-Labs / MixLoRA
State-of-the-art Parameter-Efficient MoE Fine-tuning Method
Related projects
Alternatives and complementary repositories for MixLoRA
- An Efficient LLM Fine-Tuning Factory Optimized for MoE PEFT
- [SIGIR'24] The official implementation code of MOELoRA.