thu-nics / MoA

[CoLM'25] The official implementation of the paper <MoA: Mixture of Sparse Attention for Automatic Large Language Model Compression>
141 stars · Updated 3 weeks ago

Alternatives and similar repositories for MoA

Users interested in MoA are comparing it to the libraries listed below.
