lucidrains / flash-attention-jax
Implementation of Flash Attention in JAX
☆204 · Updated 11 months ago

Alternatives and similar repositories for flash-attention-jax:
Users interested in flash-attention-jax are comparing it to the libraries listed below.
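The repository above implements Flash Attention, whose core trick is to process keys and values in chunks with an online (streaming) softmax so the full attention matrix is never materialized. The sketch below illustrates that idea in plain JAX for a single unbatched head; it is not this repository's API, and the function name, shapes, and chunk size are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def chunked_attention(q, k, v, chunk=128):
    """Attention computed over key/value chunks with an online softmax.

    Illustrative sketch of the Flash-Attention-style memory-efficient
    pattern (not the flash-attention-jax API).
    Shapes: q (n, d), k (m, d), v (m, d_v).
    """
    n, d = q.shape
    scale = d ** -0.5
    m_i = jnp.full((n,), -jnp.inf)        # running row-wise max of scores
    l_i = jnp.zeros((n,))                 # running softmax denominator
    o_i = jnp.zeros((n, v.shape[-1]))     # running (unnormalized) output
    num_chunks = -(-k.shape[0] // chunk)  # ceil division
    for idx in range(num_chunks):
        kc = k[idx * chunk:(idx + 1) * chunk]
        vc = v[idx * chunk:(idx + 1) * chunk]
        s = (q @ kc.T) * scale                    # scores for this chunk
        m_new = jnp.maximum(m_i, s.max(-1))
        alpha = jnp.exp(m_i - m_new)              # rescale old accumulators
        p = jnp.exp(s - m_new[:, None])           # stabilized exponentials
        l_i = l_i * alpha + p.sum(-1)
        o_i = o_i * alpha[:, None] + p @ vc
        m_i = m_new
    return o_i / l_i[:, None]
```

Because each chunk's contribution is rescaled by `alpha` when a larger running max appears, the result matches a direct `softmax(q @ k.T * scale) @ v` while only ever holding an `(n, chunk)` score block in memory.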
- Implementation of a Transformer, but completely in Triton ☆257 · Updated 2 years ago
- jax-triton contains integrations between JAX and OpenAI Triton ☆379 · Updated 3 weeks ago
- JMP is a Mixed Precision library for JAX. ☆191 · Updated 3 weeks ago
- ☆182 · Updated 2 weeks ago
- JAX Synergistic Memory Inspector ☆168 · Updated 7 months ago
- LoRA for arbitrary JAX models and functions ☆135 · Updated 11 months ago
- Implementation of fused cosine similarity attention in the same style as Flash Attention ☆210 · Updated 2 years ago
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) ☆186 · Updated 2 years ago
- ☆284 · Updated last week
- JAX implementation of the Llama 2 model ☆215 · Updated last year
- ☆338 · Updated 10 months ago
- A simple library for scaling up JAX programs ☆129 · Updated 3 months ago
- Accelerated First Order Parallel Associative Scan ☆171 · Updated 6 months ago
- Inference code for LLaMA models in JAX ☆114 · Updated 9 months ago
- This repository contains the experimental PyTorch native float8 training UX ☆221 · Updated 6 months ago
- A library for unit scaling in PyTorch ☆122 · Updated 2 months ago
- JAX-Toolbox ☆280 · Updated this week
- Library for reading and processing ML training data. ☆386 · Updated this week
- JAX bindings for Flash Attention v2 ☆85 · Updated 7 months ago
- Train very large language models in Jax. ☆202 · Updated last year
- Unofficial JAX implementations of deep learning research papers ☆153 · Updated 2 years ago
- Implementation of a memory efficient multi-head attention as proposed in the paper "Self-attention Does Not Need O(n²) Memory" ☆370 · Updated last year
- Orbax provides common checkpointing and persistence utilities for JAX users ☆339 · Updated this week
- ☆65 · Updated 2 years ago
- ☆58 · Updated 2 years ago
- ☆111 · Updated 2 weeks ago
- Run PyTorch in JAX. 🤝 ☆219 · Updated this week
- A user-friendly tool chain that enables the seamless execution of ONNX models using JAX as the backend. ☆107 · Updated 3 weeks ago
- Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimenta… ☆478 · Updated 2 weeks ago
- CLU lets you write beautiful training loops in JAX. ☆333 · Updated 2 weeks ago