Bumble666 / Hyper_MoE
☆33 · Updated 9 months ago
Alternatives and similar repositories for Hyper_MoE
Users interested in Hyper_MoE are comparing it to the repositories listed below.
- ☆161 · Updated last year
- ☆54 · Updated 10 months ago
- Code for the ACL 2024 paper "MELoRA: Mini-Ensemble Low-Rank Adapter for Parameter-Efficient Fine-Tuning" · ☆32 · Updated 8 months ago
- MoCLE (first MLLM with MoE for instruction customization and generalization) (https://arxiv.org/abs/2312.12379) · ☆44 · Updated 3 months ago
- Code for the ACL 2024 paper "SAPT: A Shared Attention Framework for Parameter-Efficient Continual Learning of Large Language …