pjlab-sys4nlp / llama-moe

⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)
908 stars · Updated last month
