Dao-AILab / flash-attention

Fast and memory-efficient exact attention
22,231 · Updated this week
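The listing describes flash-attention as fast, memory-efficient, exact attention. As a point of reference, below is a minimal sketch of how the library is typically invoked; it assumes a CUDA GPU, half-precision inputs, and the flash-attn package installed, and uses the (batch, seqlen, nheads, headdim) tensor layout the library documents.

```python
# Minimal sketch (not part of this listing): calling flash_attn_func on
# random half-precision tensors. Assumes a CUDA GPU and that flash-attn
# is installed (pip install flash-attn).
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64

# flash-attention expects (batch, seqlen, nheads, headdim) tensors in fp16/bf16 on GPU.
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact (non-approximate) attention computed without materializing the full
# seqlen x seqlen score matrix; causal=True applies a causal mask.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([2, 1024, 8, 64])
```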

Alternatives and similar repositories for flash-attention

Users interested in flash-attention are comparing it to the libraries listed below.

