JAX bindings for Flash Attention v2
☆104 · Updated last month (Feb 28, 2026)
Alternatives and similar repositories for flash_attn_jax
Users interested in flash_attn_jax are comparing it to the libraries listed below.
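Many of the entries below are FlashAttention ports or attention kernels for JAX. For orientation, here is a minimal NumPy sketch of the scaled dot-product attention all of these kernels compute; FlashAttention-style implementations produce the same result but tile the computation so the full (seq, seq) score matrix is never materialized. The function names are illustrative, not taken from any listed library.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v, causal=False):
    # Reference scaled dot-product attention. FlashAttention kernels compute
    # the same output, but stream over key/value tiles with a running
    # (max, sum) pair instead of building the full score matrix.
    d = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d)  # (..., seq_q, seq_k)
    if causal:
        seq_q, seq_k = scores.shape[-2:]
        # Mask out future positions (strictly above the diagonal).
        mask = np.triu(np.ones((seq_q, seq_k), dtype=bool), k=1)
        scores = np.where(mask, -np.inf, scores)
    return softmax(scores) @ v

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((2, 8, 16)) for _ in range(3))
out = attention(q, k, v, causal=True)
print(out.shape)  # (2, 8, 16)
```

Under a causal mask, position 0 can attend only to itself, so the first output row equals the first value row exactly; this is a quick sanity check when comparing a fused kernel against the reference.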
- A simple, easy-to-understand library for diffusion models using Flax and Jax. Includes detailed notebooks on DDPM, DDIM, and EDM with sim… ☆41 · Updated 3 weeks ago (Mar 7, 2026)
- Tensor Parallelism with JAX + Shard Map ☆11 · Updated 2 years ago (Sep 29, 2023)
- JMP is a Mixed Precision library for JAX. ☆212 · Updated last year (Jan 30, 2025)
- A FlashAttention implementation for JAX with support for efficient document mask computation and context parallelism. ☆160 · Updated 4 months ago (Nov 11, 2025)
- ☆350 · Updated this week
- A purer tokenizer with a higher compression ratio, in Rust ☆13 · Updated last year (Dec 21, 2024)
- Supporting PyTorch FSDP for optimizers ☆84 · Updated last year (Dec 8, 2024)
- A port of muP to JAX/Haiku ☆25 · Updated 3 years ago (Oct 23, 2022)
- jax-triton contains integrations between JAX and OpenAI Triton ☆442 · Updated 2 weeks ago (Mar 13, 2026)
- An implementation of the Llama architecture, to instruct and delight ☆21 · Updated 9 months ago (May 31, 2025)
- ☆23 · Updated last year (Jun 18, 2024)
- Unofficial but Efficient Implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆93 · Updated 2 years ago (Jan 25, 2024)
- A simple library for scaling up JAX programs ☆146 · Updated 4 months ago (Nov 4, 2025)
- Parallel Associative Scan for Language Models ☆18 · Updated 2 years ago (Jan 8, 2024)
- ☆35 · Updated last year (Nov 22, 2024)
- ☆29 · Updated last year (Jul 9, 2024)
- PyTorch half-precision GEMM library with fused optional bias and optional ReLU/GELU ☆78 · Updated last year (Dec 3, 2024)
- Code for the paper "Stack Attention: Improving the Ability of Transformers to Model Hierarchical Patterns" ☆18 · Updated 2 years ago (Mar 15, 2024)
- ☆18 · Updated last year (Aug 24, 2024)
- A JAX nn library ☆21 · Updated 6 months ago (Sep 9, 2025)
- JAX/Flax implementation of the Hyena Hierarchy ☆34 · Updated 2 years ago (Apr 27, 2023)
- Ring attention implementation with flash attention ☆998 · Updated 6 months ago (Sep 10, 2025)
- AGaLiTe: Approximate Gated Linear Transformers for Online Reinforcement Learning (published in TMLR) ☆23 · Updated last year (Oct 15, 2024)
- Official Repository for Efficient Linear-Time Attention Transformers. ☆18 · Updated last year (Jun 2, 2024)
- ☆43 · Updated 5 months ago (Oct 15, 2025)
- Tree Attention: Topology-aware Decoding for Long-Context Attention on GPU clusters ☆133 · Updated last year (Dec 3, 2024)
- Source-to-Source Debuggable Derivatives in Pure Python ☆15 · Updated 2 years ago (Jan 23, 2024)
- Einsum-like high-level array sharding API for JAX ☆34 · Updated last year (Jul 16, 2024)
- ☆571 · Updated last year (Jul 11, 2024)
- Parallelizing non-linear sequential models over the sequence length ☆56 · Updated 9 months ago (Jun 23, 2025)
- Machine Learning eXperiment Utilities ☆48 · Updated 8 months ago (Jul 29, 2025)
- ☆12 · Updated 2 years ago (Jan 4, 2024)
- Stochastic trace estimation using JAX ☆17 · Updated 7 months ago (Aug 20, 2025)
- Minimal (400 LOC) implementation of maximal (multi-node, FSDP) GPT training ☆132 · Updated last year (Apr 17, 2024)
- ☆14 · Updated 3 months ago (Dec 21, 2025)
- A flexible and efficient implementation of Flash Attention 2.0 for JAX, supporting multiple backends (GPU/TPU/CPU) and platforms (Triton/… ☆34 · Updated last year (Mar 4, 2025)
- JAX Synergistic Memory Inspector ☆185 · Updated last year (Jul 16, 2024)
- Implement Flash Attention using Cute. ☆103 · Updated last year (Dec 17, 2024)
- Library for reading and processing ML training data. ☆703 · Updated this week