RobertCsordas / moe_attention

Official repository for the paper "SwitchHead: Accelerating Transformers with Mixture-of-Experts Attention"
102 stars · Updated Sep 30, 2024

Alternatives and similar repositories for moe_attention

Users interested in moe_attention are comparing it to the libraries listed below.
