google / flaxformer
☆358 · Updated last year
Alternatives and similar repositories for flaxformer
Users interested in flaxformer are comparing it to the libraries listed below.
- ☆186 · Updated last month
- Implementation of Flash Attention in Jax ☆213 · Updated last year
- Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimenta… ☆513 · Updated last week
- JAX Synergistic Memory Inspector ☆175 · Updated last year
- Task-based datasets, preprocessing, and evaluation for sequence models. ☆583 · Updated 2 months ago
- ☆256 · Updated last month
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) ☆187 · Updated 3 years ago
- Train very large language models in Jax. ☆204 · Updated last year
- JAX implementation of the Llama 2 model ☆219 · Updated last year
- ☆67 · Updated 2 years ago
- Inference code for LLaMA models in JAX ☆118 · Updated last year
- Implementation of a Transformer, but completely in Triton ☆270 · Updated 3 years ago
- Sequence modeling with Mega. ☆296 · Updated 2 years ago
- jax-triton contains integrations between JAX and OpenAI Triton (usage sketch after this list) ☆405 · Updated 3 weeks ago
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆607 · Updated this week
- ☆166 · Updated 2 years ago
- Swarm training framework using Haiku + JAX + Ray for layer-parallel transformer language models on unreliable, heterogeneous nodes ☆239 · Updated 2 years ago
- Named tensors with first-class dimensions for PyTorch ☆332 · Updated 2 years ago
- Language Modeling with the H3 State Space Model ☆519 · Updated last year
- Amos optimizer with JEstimator lib. ☆82 · Updated last year
- ☆61 · Updated 3 years ago
- ☆303 · Updated last year
- LoRA for arbitrary JAX models and functions ☆140 · Updated last year
- ☆320 · Updated 2 weeks ago
- CLU lets you write beautiful training loops in JAX (metric-writer sketch after this list) ☆349 · Updated 3 weeks ago
- A minimal PyTorch Lightning OpenAI GPT with DeepSpeed training! ☆112 · Updated 2 years ago
- ☆273 · Updated last year
- JMP is a mixed precision library for JAX (mixed-precision sketch after this list) ☆206 · Updated 5 months ago
- Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT ☆214 · Updated 10 months ago
- ☆230 · Updated 5 months ago
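
A few of these libraries lend themselves to short usage sketches. First, jax-triton: the pattern below follows the project's README, in which a kernel written with `@triton.jit` is invoked from JAX through `jax_triton.triton_call`. The kernel body and block size are illustrative, and running it requires a GPU with Triton support.

```python
import jax
import jax.numpy as jnp
import jax_triton as jt
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, block_size: tl.constexpr):
    # Each program instance handles one contiguous block of elements.
    pid = tl.program_id(axis=0)
    offsets = pid * block_size + tl.arange(0, block_size)
    tl.store(out_ptr + offsets,
             tl.load(x_ptr + offsets) + tl.load(y_ptr + offsets))

def add(x: jnp.ndarray, y: jnp.ndarray) -> jnp.ndarray:
    block_size = 8
    return jt.triton_call(
        x, y,
        kernel=add_kernel,
        out_shape=jax.ShapeDtypeStruct(shape=x.shape, dtype=x.dtype),
        grid=(x.size // block_size,),
        block_size=block_size,
    )

x = jnp.arange(8, dtype=jnp.float32)
print(jax.jit(add)(x, x))  # [ 0.  2.  4.  6.  8. 10. 12. 14.]
```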
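CLU bundles several training-loop utilities; a minimal, hedged sketch of its metric writers follows. The log directory and the scalar values are placeholders.

```python
from clu import metric_writers

# Creates a writer that logs summaries under the given directory.
writer = metric_writers.create_default_writer("/tmp/flaxformer_logs")

for step in range(1, 4):
    # write_scalars takes a step number and a mapping of scalar metrics.
    writer.write_scalars(step, {"train_loss": 1.0 / step})

writer.flush()
```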
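Finally, a minimal sketch of JMP's policy-based mixed precision, assuming a toy linear layer: the policy keeps parameters in float32, casts to float16 for compute, and returns float32 outputs.

```python
import jax.numpy as jnp
import jmp

# params stay in f32; compute runs in f16; outputs are cast back to f32.
policy = jmp.get_policy("params=float32,compute=float16,output=float32")

params = {"w": jnp.ones((4, 2))}  # toy parameters, stored in full precision

def forward(params, x):
    # cast_to_compute maps over a pytree, so params and x cast together.
    params, x = policy.cast_to_compute((params, x))
    y = x @ params["w"]
    return policy.cast_to_output(y)

print(forward(params, jnp.ones((3, 4))).dtype)  # float32
```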