inclusionAI / MoBE (View on GitHub)
Mixture-of-Basis-Experts for Compressing MoE-based LLMs
32 · Dec 24, 2025 · Updated 3 months ago

Alternatives and similar repositories for MoBE

Users interested in MoBE are comparing it to the libraries listed below.
