lucidrains / mixture-of-attention
Some personal experiments around routing tokens to different autoregressive attention, akin to mixture-of-experts
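To make the idea concrete, here is a minimal, hypothetical sketch (not the repository's actual API) of routing tokens to one of several independent causal self-attention "experts", in the same spirit as mixture-of-experts routing for feedforward layers. The function names, the identity Q/K/V projections, and the top-1 routing scheme are all simplifying assumptions for illustration.

```python
# Hypothetical sketch of token routing to attention experts (not the repo's API).
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def causal_attention(x):
    # Plain single-head causal self-attention with identity projections,
    # standing in for a full attention module.
    n, d = x.shape
    scores = x @ x.T / np.sqrt(d)
    mask = np.tril(np.ones((n, n), dtype=bool))
    scores = np.where(mask, scores, -np.inf)
    return softmax(scores, axis=-1) @ x

def mixture_of_attention(tokens, router_w, num_experts):
    # Route each token to its top-1 expert, run attention within each
    # expert's group of tokens, then scatter outputs back into place.
    gates = softmax(tokens @ router_w)       # (n, E) routing probabilities
    assignment = gates.argmax(axis=-1)       # top-1 expert index per token
    out = np.zeros_like(tokens)
    for e in range(num_experts):
        idx = np.where(assignment == e)[0]   # tokens routed to expert e
        if len(idx) == 0:
            continue
        # Scaling by the gate value is how a real implementation would keep
        # the routing decision differentiable.
        out[idx] = causal_attention(tokens[idx]) * gates[idx, e:e + 1]
    return out

n, d, E = 8, 16, 2
tokens = rng.standard_normal((n, d))
router_w = rng.standard_normal((d, E))
out = mixture_of_attention(tokens, router_w, E)
print(out.shape)  # (8, 16)
```

Note that because each expert only attends within its own group of tokens, the routing effectively partitions the sequence, which is where the compute savings relative to full attention would come from.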
123 · Oct 17, 2024 · Updated last year

Alternatives and similar repositories for mixture-of-attention

Users interested in mixture-of-attention are comparing it to the libraries listed below.
