tspeterkim / flash-attention-minimal

Flash Attention in ~100 lines of CUDA (forward pass only)
681 stars · Updated 2 weeks ago
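For orientation, below is a minimal sketch (not the repository's kernel) of the online-softmax recurrence that a FlashAttention-style forward pass is built on: each thread owns one query row and streams over the keys, so the N x N score matrix is never materialized. Tensor layout ([N, d] row-major), the name `online_softmax_attention`, and the `D_MAX <= 128` head-dimension bound are assumptions for this sketch.

```cuda
// A minimal sketch (NOT the repository's code): one thread per query row
// computes softmax(Q K^T / sqrt(d)) V with the online-softmax recurrence.
// Q, K, V, O are row-major [N, d]; head dimension d <= D_MAX is assumed.
#include <cuda_runtime.h>
#include <math.h>

#define D_MAX 128  // assumed upper bound on the head dimension

__global__ void online_softmax_attention(const float* Q, const float* K,
                                         const float* V, float* O,
                                         int N, int d) {
    int row = blockIdx.x * blockDim.x + threadIdx.x;  // one query row per thread
    if (row >= N) return;

    float scale = rsqrtf((float)d);  // 1 / sqrt(d)
    float m = -INFINITY;             // running max of the scores seen so far
    float l = 0.0f;                  // running softmax denominator
    float acc[D_MAX] = {0.0f};       // running (unnormalized) output row

    for (int k = 0; k < N; ++k) {    // stream over keys, no N x N score matrix
        // score s = <q_row, k_k> / sqrt(d)
        float s = 0.0f;
        for (int j = 0; j < d; ++j) s += Q[row * d + j] * K[k * d + j];
        s *= scale;

        // online softmax: rescale the old state to the new running max
        float m_new = fmaxf(m, s);
        float correction = expf(m - m_new);
        float p = expf(s - m_new);

        l = l * correction + p;
        for (int j = 0; j < d; ++j)
            acc[j] = acc[j] * correction + p * V[k * d + j];
        m = m_new;
    }

    // final normalization by the softmax denominator
    for (int j = 0; j < d; ++j) O[row * d + j] = acc[j] / l;
}
```

A launch would look like `online_softmax_attention<<<(N + 255) / 256, 256>>>(dQ, dK, dV, dO, N, d)` on device pointers. The repository's kernel presumably adds the shared-memory tiling of K/V blocks described in the FlashAttention paper; the sketch above shows only the rescaling recurrence that makes such tiling possible.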

Alternatives and similar repositories for flash-attention-minimal:
