catswe / flash-attention-residuals
Triton kernels and PyTorch ops for Block Attention Residuals (AttnRes)
71 stars · May 5, 2026 · Updated this week

Alternatives and similar repositories for flash-attention-residuals

Users interested in flash-attention-residuals are comparing it to the libraries listed below.
