fla-org / flash-linear-attention

🚀 Efficient implementations of state-of-the-art linear attention models in Torch and Triton
☆ 2,438 · Updated this week
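For orientation, the description refers to linear attention, which replaces the softmax attention matrix with a running key-value state so that causal attention can be computed recurrently. Below is a minimal, illustrative PyTorch sketch of that recurrence; it is not flash-linear-attention's API (the library ships fused Triton kernels), and the function name, ReLU feature map, and normalizer handling here are assumptions for demonstration only.

```python
import torch

def naive_linear_attention(q, k, v):
    """Illustrative causal linear attention; q, k, v: (batch, seq_len, dim).

    Instead of softmax(Q K^T) V, maintain a running state S_t = S_{t-1} + k_t v_t^T
    and a normalizer z_t = z_{t-1} + k_t, so each step costs O(dim^2) rather than O(t).
    """
    b, t, d = q.shape
    s = torch.zeros(b, d, d, dtype=q.dtype, device=q.device)  # running sum of k_i v_i^T
    z = torch.zeros(b, d, dtype=q.dtype, device=q.device)     # running sum of k_i
    out = torch.empty_like(v)
    for i in range(t):
        qi, ki, vi = q[:, i], k[:, i], v[:, i]
        s = s + ki.unsqueeze(-1) * vi.unsqueeze(1)             # rank-1 state update
        z = z + ki
        num = torch.einsum('bd,bde->be', qi, s)                # q_i^T S_t
        den = (qi * z).sum(-1, keepdim=True).clamp(min=1e-6)   # q_i^T z_t
        out[:, i] = num / den
    return out

# Example usage with a simple positive feature map (ReLU), an assumption for this sketch
q = torch.randn(2, 8, 16).relu()
k = torch.randn(2, 8, 16).relu()
v = torch.randn(2, 8, 16)
print(naive_linear_attention(q, k, v).shape)  # torch.Size([2, 8, 16])
```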

Alternatives and similar repositories for flash-linear-attention

Users interested in flash-linear-attention are comparing it to the libraries listed below.
