thu-nics / MoA
[CoLM'25] The official implementation of the paper <MoA: Mixture of Sparse Attention for Automatic Large Language Model Compression>
157 stars · Updated Jan 14, 2026 (2 months ago)

Alternatives and similar repositories for MoA

Users interested in MoA are comparing it to the libraries listed below.

