lucidrains / flash-attention-jax
Implementation of Flash Attention in Jax
☆218 · Updated last year
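Several entries below (the memory-efficient attention port, the Flash Attention v2 bindings, the fused cosine-similarity variant) build on the same tiled, online-softmax trick as this repository. As orientation, here is a minimal sketch of that idea in plain JAX. It illustrates the technique only; the function name, shapes, and chunk size are assumptions for the example, not this repository's API.

```python
import jax
import jax.numpy as jnp

def chunked_attention(q, k, v, chunk_size=128):
    """Attention over (seq_len, dim) arrays without materializing the full
    (seq_len, seq_len) score matrix: keys/values are processed in chunks
    while a running max and running denominator keep the softmax exact
    (the online-softmax trick at the core of flash attention)."""
    q = q * q.shape[-1] ** -0.5
    num_chunks = k.shape[0] // chunk_size  # assumes seq_len % chunk_size == 0
    k_chunks = k.reshape(num_chunks, chunk_size, -1)
    v_chunks = v.reshape(num_chunks, chunk_size, -1)

    def scan_fn(carry, kv):
        out, row_sum, row_max = carry
        k_c, v_c = kv
        scores = q @ k_c.T                                  # (seq_len, chunk)
        new_max = jnp.maximum(row_max, scores.max(axis=-1, keepdims=True))
        correction = jnp.exp(row_max - new_max)             # rescale old stats
        probs = jnp.exp(scores - new_max)
        out = out * correction + probs @ v_c
        row_sum = row_sum * correction + probs.sum(axis=-1, keepdims=True)
        return (out, row_sum, new_max), None

    init = (jnp.zeros_like(q),
            jnp.zeros((q.shape[0], 1)),
            jnp.full((q.shape[0], 1), -jnp.inf))
    (out, row_sum, _), _ = jax.lax.scan(scan_fn, init, (k_chunks, v_chunks))
    return out / row_sum

# Sanity check against naive attention.
kq, kk, kv = jax.random.split(jax.random.PRNGKey(0), 3)
q = jax.random.normal(kq, (512, 64))
k = jax.random.normal(kk, (512, 64))
v = jax.random.normal(kv, (512, 64))
naive = jax.nn.softmax(q @ k.T * 64 ** -0.5, axis=-1) @ v
assert jnp.allclose(chunked_attention(q, k, v), naive, atol=1e-4)
```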
Alternatives and similar repositories for flash-attention-jax
Users interested in flash-attention-jax are comparing it to the libraries listed below.
- ☆188 · Updated last week
- Implementation of a Transformer, but completely in Triton ☆275 · Updated 3 years ago
- jax-triton contains integrations between JAX and OpenAI Triton ☆424 · Updated 3 weeks ago
- JAX Synergistic Memory Inspector ☆180 · Updated last year
- ☆362 · Updated last year
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) ☆189 · Updated 3 years ago
- JMP is a Mixed Precision library for JAX. ☆208 · Updated 8 months ago
- JAX implementation of the Llama 2 model ☆218 · Updated last year
- LoRA for arbitrary JAX models and functions (a minimal low-rank-update sketch follows this list) ☆142 · Updated last year
- ☆331 · Updated 3 weeks ago
- A library for unit scaling in PyTorch ☆130 · Updated 2 months ago
- JAX-Toolbox ☆343 · Updated this week
- Memory Efficient Attention (O(sqrt(n))) for Jax and PyTorch ☆183 · Updated 2 years ago
- Experiment of using Tangent to autodiff triton ☆81 · Updated last year
- JAX bindings for Flash Attention v2 ☆92 · Updated 3 weeks ago
- ☆62 · Updated 3 years ago
- A simple library for scaling up JAX programs ☆143 · Updated 11 months ago
- Train very large language models in Jax. ☆209 · Updated last year
- ☆122 · Updated last year
- Accelerated First Order Parallel Associative Scan ☆188 · Updated last year
- ☆233 · Updated 7 months ago
- Inference code for LLaMA models in JAX ☆120 · Updated last year
- Machine Learning eXperiment Utilities ☆47 · Updated 2 months ago
- Unofficial JAX implementations of deep learning research papers ☆156 · Updated 3 years ago
- ☆89 · Updated last year
- A port of the Mistral-7B model in JAX ☆32 · Updated last year
- This repository contains the experimental PyTorch native float8 training UX ☆224 · Updated last year
- Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimenta… ☆536 · Updated last month
- Minimal yet performant LLM examples in pure JAX ☆177 · Updated last week
- Implementation of fused cosine similarity attention in the same style as Flash Attention ☆216 · Updated 2 years ago
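For the "LoRA for arbitrary JAX models and functions" entry above, here is a minimal sketch of the low-rank-adaptation idea in plain JAX, independent of that library's actual API: a frozen weight W is augmented with a trainable low-rank product B @ A, so only rank × (in + out) parameters are trained. All names and shapes here are illustrative.

```python
import jax
import jax.numpy as jnp

def lora_linear(x, W, A, B, alpha=1.0):
    # Effective weight is W + alpha * (B @ A); W stays frozen,
    # only the small factors A and B receive gradients.
    return x @ (W + alpha * (B @ A)).T

in_dim, out_dim, rank = 512, 256, 8
kW, kA, kx = jax.random.split(jax.random.PRNGKey(0), 3)
W = jax.random.normal(kW, (out_dim, in_dim))      # frozen pretrained weight
A = jax.random.normal(kA, (rank, in_dim)) * 0.01  # trainable down-projection
B = jnp.zeros((out_dim, rank))                    # zero-init up-projection
x = jax.random.normal(kx, (4, in_dim))
y = lora_linear(x, W, A, B)  # equals x @ W.T at init, since B @ A == 0
```

Zero-initializing B is the conventional choice: the adapted layer starts out identical to the pretrained one, and the update grows from zero during fine-tuning.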