thu-nics / MoA

The official implementation of the paper "MoA: Mixture of Sparse Attention for Automatic Large Language Model Compression".

Alternatives and similar repositories for MoA

Users interested in MoA are comparing it to the libraries listed below.
