ag1988 / top_k_attention

The accompanying code for "Memory-efficient Transformers via Top-k Attention" (Ankit Gupta, Guy Dar, Shaya Goodman, David Ciprut, Jonathan Berant. SustaiNLP 2021).
69 stars · Updated Sep 19, 2021
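The sketch below is not taken from the repository; it is a minimal PyTorch illustration of the general top-k attention idea named in the paper title: for each query, keep only the k largest attention scores and mask the rest before the softmax, so the value aggregation involves only k keys per query. The function name, signature, and tensor layout are assumptions for illustration, not the repository's API.

```python
import torch
import torch.nn.functional as F

def top_k_attention(q, k, v, topk=64):
    """Attention where each query attends only to its top-k scoring keys.

    q, k, v: (batch, heads, seq_len, head_dim) tensors.
    topk: number of keys each query keeps (clamped to the key length).
    Illustrative sketch only; not the repository's implementation.
    """
    d = q.size(-1)
    # Scaled dot-product scores: (batch, heads, len_q, len_k)
    scores = torch.matmul(q, k.transpose(-2, -1)) / d ** 0.5
    topk = min(topk, scores.size(-1))
    # k-th largest score per query row serves as a threshold;
    # everything below it is masked to -inf and gets zero softmax weight.
    top_vals, _ = scores.topk(topk, dim=-1)
    threshold = top_vals[..., -1:].expand_as(scores)
    masked = scores.masked_fill(scores < threshold, float("-inf"))
    weights = F.softmax(masked, dim=-1)
    return torch.matmul(weights, v)
```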
