pjlab-sys4nlp / llama-moe
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)
⭐ 1,000 stars · Updated Dec 6, 2024
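The repository builds sparse Mixture-of-Experts models out of a dense LLaMA and continues pre-training them. As a rough, hypothetical illustration of the core idea (this is not the repository's actual code), a gated MoE forward pass routes each input to its top-2 experts and mixes their outputs by softmax-normalized gate weights:

```python
import math

def top2_gate(scores):
    """Select the two highest-scoring experts and softmax-normalize
    their gate scores so the selected weights sum to 1."""
    top = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:2]
    exps = [math.exp(scores[i]) for i in top]
    z = sum(exps)
    return [(i, e / z) for i, e in zip(top, exps)]

def moe_forward(x, experts, scores):
    """Sparse MoE forward: only the two selected experts run;
    their outputs are combined by the normalized gate weights."""
    out = 0.0
    for idx, w in top2_gate(scores):
        out += w * experts[idx](x)
    return out

# Toy usage: three scalar 'experts' and one gate score per expert.
experts = [lambda x: x + 1, lambda x: 2 * x, lambda x: x * x]
scores = [0.1, 2.0, 1.0]  # expert 1 and expert 2 win the top-2 gate
y = moe_forward(3.0, experts, scores)
```

In LLaMA-MoE-style models the experts are feed-forward sub-networks carved out of the original dense FFN, and the gate is a small learned linear layer, but the routing arithmetic follows this shape.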

Alternatives and similar repositories for llama-moe

Users interested in llama-moe are comparing it to the libraries listed below.

