lucidrains / flash-attention-jax
Implementation of Flash Attention in Jax
☆215 · Updated last year
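For context, below is a minimal JAX sketch of the chunked, online-softmax attention that flash attention builds on. It illustrates the technique only and is not this repository's API; the function name, single-head shapes, and the requirement that the sequence length divide evenly into chunks are assumptions of the sketch.

```python
import jax
import jax.numpy as jnp

def chunked_attention(q, k, v, chunk_size=128):
    """Streaming softmax attention over key/value chunks.

    q, k, v: (seq_len, dim); assumes seq_len % chunk_size == 0.
    Carries a running max and running sum so the softmax is computed
    in one pass without materializing the full (seq_len, seq_len)
    score matrix; this is the core flash-attention idea.
    """
    seq_len, dim = q.shape
    q = q * dim ** -0.5

    def scan_fn(carry, kv_chunk):
        running_max, running_sum, acc = carry
        k_chunk, v_chunk = kv_chunk
        scores = q @ k_chunk.T                                   # (seq, chunk)
        new_max = jnp.maximum(running_max, scores.max(-1, keepdims=True))
        correction = jnp.exp(running_max - new_max)              # rescale old terms
        p = jnp.exp(scores - new_max)
        new_sum = running_sum * correction + p.sum(-1, keepdims=True)
        new_acc = acc * correction + p @ v_chunk
        return (new_max, new_sum, new_acc), None

    k_chunks = k.reshape(-1, chunk_size, dim)
    v_chunks = v.reshape(-1, chunk_size, dim)
    init = (jnp.full((seq_len, 1), -jnp.inf),
            jnp.zeros((seq_len, 1)),
            jnp.zeros((seq_len, dim)))
    (_, s, acc), _ = jax.lax.scan(scan_fn, init, (k_chunks, v_chunks))
    return acc / s

q = k = v = jnp.ones((256, 64))
out = chunked_attention(q, k, v)  # agrees with jax.nn.softmax(q @ k.T * d**-0.5) @ v
```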
Alternatives and similar repositories for flash-attention-jax
Users interested in flash-attention-jax are comparing it to the libraries listed below.
- ☆187 · Updated 2 weeks ago
- Implementation of a Transformer, but completely in Triton · ☆273 · Updated 3 years ago
- Implementation of the Transformer architecture from PaLM ("Scaling Language Modeling with Pathways") in Jax, using the Equinox framework · ☆187 · Updated 3 years ago
- jax-triton contains integrations between JAX and OpenAI Triton (see the jax-triton sketch after this list) · ☆411 · Updated last month
- JMP is a mixed-precision library for JAX (see the JMP sketch after this list) · ☆206 · Updated 6 months ago
- JAX Synergistic Memory Inspector · ☆177 · Updated last year
- LoRA for arbitrary JAX models and functions (see the LoRA sketch after this list) · ☆140 · Updated last year
- ☆361 · Updated last year
- JAX implementation of the Llama 2 model · ☆219 · Updated last year
- Inference code for LLaMA models in JAX · ☆118 · Updated last year
- A library for unit scaling in PyTorch · ☆128 · Updated 3 weeks ago
- Train very large language models in Jax · ☆205 · Updated last year
- An experiment in using Tangent to autodiff Triton · ☆79 · Updated last year
- ☆323 · Updated last month
- JAX-Toolbox · ☆327 · Updated this week
- ☆113 · Updated last year
- Accelerated First Order Parallel Associative Scan (see the associative-scan sketch after this list) · ☆184 · Updated 11 months ago
- ☆61 · Updated 3 years ago
- A simple library for scaling up JAX programs · ☆140 · Updated 9 months ago
- Memory Efficient Attention (O(sqrt(n))) for Jax and PyTorch · ☆184 · Updated 2 years ago
- ☆232 · Updated 5 months ago
- Machine Learning eXperiment Utilities · ☆46 · Updated this week
- ☆137 · Updated last week
- JAX bindings for Flash Attention v2 · ☆90 · Updated this week
- This repository contains the experimental PyTorch native float8 training UX · ☆224 · Updated last year
- ☆67 · Updated 2 years ago
- A user-friendly toolchain for seamlessly running ONNX models with JAX as the backend · ☆118 · Updated last week
- Named tensors with first-class dimensions for PyTorch · ☆332 · Updated 2 years ago
- Implementation of fused cosine-similarity attention in the same style as Flash Attention · ☆214 · Updated 2 years ago
- Miscellaneous utility functions / decorators / modules for PyTorch and Accelerate to help speed up implementation of new… · ☆124 · Updated last year
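A few sketches for the entries flagged above. First, jax-triton: a small example of calling a Triton kernel from JAX via `triton_call`, modeled on the project's documented add-kernel example. It needs a CUDA GPU with the `triton` and `jax-triton` packages installed; treat the exact keyword-forwarding behavior as an assumption.

```python
import jax
import jax.numpy as jnp
import jax_triton as jt
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, block_size: tl.constexpr):
    # Each program instance handles one contiguous block of elements.
    pid = tl.program_id(axis=0)
    offsets = pid * block_size + tl.arange(0, block_size)
    tl.store(out_ptr + offsets, tl.load(x_ptr + offsets) + tl.load(y_ptr + offsets))

def add(x, y, block_size=8):
    # Assumes x.size is a multiple of block_size (no bounds mask).
    out_shape = jax.ShapeDtypeStruct(shape=x.shape, dtype=x.dtype)
    return jt.triton_call(x, y, kernel=add_kernel, out_shape=out_shape,
                          grid=(x.size // block_size,), block_size=block_size)

x = jnp.arange(8, dtype=jnp.float32)
print(add(x, x))  # [0., 2., 4., ...]
```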
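Second, JMP: a minimal mixed-precision sketch, assuming JMP's policy API (`jmp.get_policy` with a `params=.../compute=.../output=...` string, plus `cast_to_param`, `cast_to_compute`, and `cast_to_output`) works as its README describes.

```python
import jax.numpy as jnp
import jmp

# Keep parameters in float32, do compute in float16, emit float32 outputs.
policy = jmp.get_policy('params=float32,compute=float16,output=float32')

def forward(params, x):
    params, x = policy.cast_to_compute((params, x))  # cast trees to float16
    y = x @ params['w']
    return policy.cast_to_output(y)                  # back to float32

params = policy.cast_to_param({'w': jnp.ones((4, 4))})
print(forward(params, jnp.ones((2, 4))).dtype)       # float32
```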
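Third, LoRA: the listed repository applies it to arbitrary JAX models and functions; the generic low-rank-update math, sketched here from scratch rather than with that library's API, is just a frozen weight plus a trained rank-r correction.

```python
import jax
import jax.numpy as jnp

def lora_apply(x, w, b, a, alpha=8.0):
    # Frozen base weight w: (d_in, d_out); trainable low-rank factors
    # b: (d_in, r) and a: (r, d_out). Effective weight is w + b @ a,
    # scaled by alpha / r as in the LoRA paper.
    r = a.shape[0]
    return x @ w + (x @ b) @ a * (alpha / r)

key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (64, 64))          # frozen
b = jax.random.normal(key, (64, 4)) * 0.01    # trainable
a = jnp.zeros((4, 64))                        # zero init: starts identical to base
print(lora_apply(jnp.ones((2, 64)), w, b, a).shape)  # (2, 64)
```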
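Finally, the associative-scan entry: that repository ships accelerated kernels, but the underlying trick can be shown with stock `jax.lax.associative_scan`. The first-order recurrence h[t] = a[t] * h[t-1] + x[t] looks sequential, yet the pairs (a, x) compose associatively, so it runs in O(log n) parallel depth.

```python
import jax
import jax.numpy as jnp

def linear_recurrence(a, x):
    """h[t] = a[t] * h[t-1] + x[t] with h[-1] = 0, as a parallel scan.

    Pairs compose associatively: (a1, x1) . (a2, x2) = (a1*a2, a2*x1 + x2),
    i.e. composition of the affine maps h -> a*h + x.
    """
    def combine(left, right):
        a_l, x_l = left
        a_r, x_r = right
        return a_l * a_r, a_r * x_l + x_r
    _, h = jax.lax.associative_scan(combine, (a, x))
    return h

a = jnp.full((8,), 0.5)
x = jnp.ones((8,))
print(linear_recurrence(a, x))  # 1.0, 1.5, 1.75, ... converging toward 2.0
```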