pjlab-sys4nlp / llama-moe
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)
1,002 stars · Last updated Dec 6, 2024
