Dao-AILab / flash-attention

Fast and memory-efficient exact attention
☆ 15,064 · Updated this week
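
For context, here is a minimal usage sketch of the library's public `flash_attn_func` entry point. The shape convention (batch, seqlen, nheads, headdim) and the fp16/bf16 CUDA requirement follow the project's documentation; the specific tensor sizes below are illustrative only.

```python
import torch
from flash_attn import flash_attn_func

# Illustrative dimensions (assumptions, not library defaults).
batch, seqlen, nheads, headdim = 2, 1024, 8, 64

# FlashAttention kernels require half-precision inputs on a CUDA device.
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact (not approximate) attention, computed without materializing the
# full seqlen x seqlen score matrix; causal=True applies autoregressive masking.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)
```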

Alternatives and similar repositories for flash-attention:

Users interested in flash-attention are comparing it to the libraries listed below.