fla-org / flash-bidirectional-linear-attention
Triton implementation of bidirectional (non-causal) linear attention
☆46 · Updated 3 months ago
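For orientation, non-causal linear attention computes attention without a causal mask by first contracting keys with values, which brings the cost down from quadratic to linear in sequence length. The sketch below is a plain-PyTorch reference, not the repository's fused Triton kernels: the function name, the `elu(x) + 1` feature map, and the `(batch, heads, seq_len, dim)` layout are illustrative assumptions, and the actual API may differ.

```python
import torch
import torch.nn.functional as F

def noncausal_linear_attention(q, k, v, eps=1e-6):
    # q, k: (batch, heads, seq_len, d_k), v: (batch, heads, seq_len, d_v)
    # Hypothetical reference implementation; the repo provides Triton kernels.
    q = F.elu(q) + 1.0  # positive feature map (one common choice)
    k = F.elu(k) + 1.0

    # Contract keys with values first: sum_n k_n v_n^T  -> (batch, heads, d_k, d_v).
    # No causal mask, so a single global summary suffices.
    kv = torch.einsum('bhnd,bhne->bhde', k, v)

    # Per-query normalizer: q_n . (sum_m k_m)  -> (batch, heads, seq_len)
    z = 1.0 / (torch.einsum('bhnd,bhd->bhn', q, k.sum(dim=2)) + eps)

    # Output: (q_n^T kv) * z_n  -> (batch, heads, seq_len, d_v)
    return torch.einsum('bhnd,bhde,bhn->bhne', q, kv, z)
```

Because the `kv` contraction is shared across all query positions, the whole computation is O(n·d_k·d_v) rather than O(n²), which is what the fused Triton kernels in this repository accelerate.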
Alternatives and similar repositories for flash-bidirectional-linear-attention:
Users interested in flash-bidirectional-linear-attention are comparing it to the libraries listed below.
- [ICLR 2025] Official PyTorch implementation of "Forgetting Transformer: Softmax Attention with a Forget Gate" ☆97 · Updated 3 weeks ago
- A WebUI for side-by-side comparison of media (images/videos) across multiple folders ☆23 · Updated 2 months ago
- ☆39 · Updated last month
- XAttention: Block Sparse Attention with Antidiagonal Scoring