OpenSparseLLMs / LLaMA-MoE-v2

🚀 LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training
☆ 80 · Updated 4 months ago
