OpenSparseLLMs / LLaMA-MoE-v2
πŸš€ LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training
β˜† 91 Β· Dec 3, 2024 Β· updated last year

Alternatives and similar repositories for LLaMA-MoE-v2

Users interested in LLaMA-MoE-v2 are comparing it to the libraries listed below.
