kyegomez / FlashAttention20

Get down and dirty with FlashAttention 2.0 in PyTorch: plug and play, no complex CUDA kernels.
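The description implies a pure-PyTorch take on the FlashAttention-2 idea: tile over the key/value sequence and keep a running (online) softmax so the full attention matrix is never materialised. Below is a minimal illustrative sketch of that technique; the function name, block size, and shapes are assumptions for this example and are not taken from the FlashAttention20 repository's actual API.

```python
import torch

def tiled_attention(q, k, v, block_size=128):
    """Hypothetical sketch of FlashAttention-style tiled attention with an online softmax.

    q, k, v: tensors of shape (batch, heads, seq_len, head_dim).
    """
    scale = q.shape[-1] ** -0.5
    out = torch.zeros_like(q)
    # Running max and normaliser for the numerically stable online softmax.
    running_max = torch.full(q.shape[:-1], float("-inf"), device=q.device).unsqueeze(-1)
    running_sum = torch.zeros_like(running_max)

    for start in range(0, k.shape[-2], block_size):
        k_blk = k[..., start:start + block_size, :]
        v_blk = v[..., start:start + block_size, :]
        scores = (q @ k_blk.transpose(-2, -1)) * scale   # (batch, heads, seq_q, blk)
        blk_max = scores.max(dim=-1, keepdim=True).values
        new_max = torch.maximum(running_max, blk_max)
        # Rescale previously accumulated output and normaliser to the new running max.
        correction = torch.exp(running_max - new_max)
        p = torch.exp(scores - new_max)
        running_sum = running_sum * correction + p.sum(dim=-1, keepdim=True)
        out = out * correction + p @ v_blk
        running_max = new_max

    return out / running_sum

# Quick check against plain softmax attention (same math, computed block by block).
if __name__ == "__main__":
    q, k, v = (torch.randn(2, 4, 256, 64) for _ in range(3))
    ref = torch.softmax((q @ k.transpose(-2, -1)) * q.shape[-1] ** -0.5, dim=-1) @ v
    assert torch.allclose(tiled_attention(q, k, v), ref, atol=1e-5)
```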

Related projects: