thu-nics / MoA

The official implementation of the paper *MoA: Mixture of Sparse Attention for Automatic Large Language Model Compression*.
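The paper's core idea is to give each attention head its own sparse attention span rather than one uniform pattern shared by all heads. As a rough illustration of that idea only (this is not the repository's API; `heterogeneous_window_attention` and its per-head `window_sizes` parameter are hypothetical names for this sketch), a minimal PyTorch example of per-head causal sliding-window attention:

```python
# Illustrative sketch of heterogeneous sparse attention: each head attends
# only within its own causal sliding window. All names are hypothetical and
# do not reflect the MoA repository's actual interfaces.
import torch
import torch.nn.functional as F

def heterogeneous_window_attention(q, k, v, window_sizes):
    """Scaled dot-product attention where head h only attends to the most
    recent `window_sizes[h]` positions (including the current one).

    q, k, v: tensors of shape (batch, heads, seq_len, head_dim)
    window_sizes: one window width per head (hypothetical parameterization)
    """
    b, h, n, d = q.shape
    scores = q @ k.transpose(-2, -1) / d ** 0.5   # (b, h, n, n)

    pos = torch.arange(n)
    dist = pos.unsqueeze(1) - pos.unsqueeze(0)    # dist[i, j] = i - j
    for head, w in enumerate(window_sizes):
        # Causal sliding window: query i may see key j iff 0 <= i - j < w.
        mask = (dist < 0) | (dist >= w)
        scores[:, head].masked_fill_(mask, float("-inf"))

    return F.softmax(scores, dim=-1) @ v

# Toy usage: 4 heads with progressively wider windows.
q = torch.randn(1, 4, 16, 8)
k = torch.randn(1, 4, 16, 8)
v = torch.randn(1, 4, 16, 8)
out = heterogeneous_window_attention(q, k, v, window_sizes=[2, 4, 8, 16])
print(out.shape)  # torch.Size([1, 4, 16, 8])
```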
