thu-nics / MoA

[CoLM'25] The official implementation of the paper <MoA: Mixture of Sparse Attention for Automatic Large Language Model Compression>
146 · Updated 2 months ago

Alternatives and similar repositories for MoA

Users interested in MoA are comparing it to the libraries listed below.
