tspeterkim / flash-attention-minimal
Flash Attention in ~100 lines of CUDA (forward pass only)
1,084 stars · Dec 30, 2024 · Updated last year

Alternatives and similar repositories for flash-attention-minimal

Users interested in flash-attention-minimal are comparing it to the libraries listed below.
