RobertCsordas / moe_attention

Official repository for the paper "SwitchHead: Accelerating Transformers with Mixture-of-Experts Attention"
96 · Updated 4 months ago

Alternatives and similar repositories for moe_attention:

Users interested in moe_attention are comparing it to the libraries listed below.