google / paxml
Pax is a JAX-based machine learning framework for training large-scale models. It allows for advanced, fully configurable experimentation and parallelization, and has demonstrated industry-leading model FLOPs utilization rates.
☆536 · Updated 3 weeks ago
Alternatives and similar repositories for paxml
Users interested in paxml are comparing it to the libraries listed below.
- jax-triton contains integrations between JAX and OpenAI Triton ☆423 · Updated 2 weeks ago (see the first sketch after this list)
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆663 · Updated this week
- JAX-Toolbox ☆337 · Updated this week
- Orbax provides common checkpointing and persistence utilities for JAX users ☆426 · Updated this week (see the Orbax sketch after this list)
- JetStream is a throughput- and memory-optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in the future; PRs welcome) ☆377 · Updated 3 months ago
- Library for reading and processing ML training data. ☆539 · Updated this week
- seqax = sequence modeling + JAX ☆167 · Updated 2 months ago
- JAX Synergistic Memory Inspector ☆179 · Updated last year
- Implementation of Flash Attention in Jax ☆217 · Updated last year (a plain-JAX baseline is sketched after this list)
- JAX implementation of the Llama 2 model ☆219 · Updated last year
- Inference code for LLaMA models in JAX ☆120 · Updated last year
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvements ☆397 · Updated this week
- Accelerate and optimize performance with streamlined training and serving options in JAX. ☆310 · Updated last week
- CLU lets you write beautiful training loops in JAX. ☆355 · Updated 2 months ago
- Train very large language models in Jax. ☆209 · Updated last year
- For optimization algorithm research and development. ☆538 · Updated this week
- JMP is a Mixed Precision library for JAX. ☆208 · Updated 7 months ago (see the JMP sketch after this list)
- A JAX-native LLM Post-Training Library ☆144 · Updated this week
- Swarm training framework using Haiku + JAX + Ray for layer parallel transformer language models on unreliable, heterogeneous nodes ☆241 · Updated 2 years ago
- Everything you want to know about Google Cloud TPU ☆546 · Updated last year
- Minimal yet performant LLM examples in pure JAX ☆160 · Updated this week
- A Jax-based library for building transformers; includes implementations of GPT, Gemma, Llama, Mixtral, Whisper, Swin, ViT and more. ☆291 · Updated last year
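
For reference, a minimal sketch of how jax-triton wires a Triton kernel into a JAX computation, following the pattern its README describes. The add kernel and the block size of 8 are illustrative choices, not requirements of the library.

```python
import jax
import jax.numpy as jnp
import jax_triton as jt
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, block_size: tl.constexpr):
    # Each program instance handles one contiguous block of elements.
    pid = tl.program_id(axis=0)
    offsets = pid * block_size + tl.arange(0, block_size)
    x = tl.load(x_ptr + offsets)
    y = tl.load(y_ptr + offsets)
    tl.store(out_ptr + offsets, x + y)

def add(x: jax.Array, y: jax.Array) -> jax.Array:
    # triton_call launches the kernel and returns a JAX array of out_shape.
    out_shape = jax.ShapeDtypeStruct(shape=x.shape, dtype=x.dtype)
    block_size = 8  # illustrative; tune for real workloads
    return jt.triton_call(
        x, y,
        kernel=add_kernel,
        out_shape=out_shape,
        grid=(x.size // block_size,),
        block_size=block_size,
    )

x = jnp.arange(8, dtype=jnp.float32)
print(add(x, x))  # [ 0.  2.  4.  6.  8. 10. 12. 14.]
```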
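
A minimal sketch of Orbax's pytree checkpointing, assuming the `orbax.checkpoint` package and a writable `/tmp` path; real training code typically wraps this in a `CheckpointManager` with retention and async options.

```python
import jax.numpy as jnp
import orbax.checkpoint as ocp

# Any JAX pytree can be saved: here a toy param dict plus a step counter.
state = {"params": {"w": jnp.ones((4, 4)), "b": jnp.zeros(4)}, "step": 100}

path = "/tmp/orbax_demo_ckpt"  # must not already exist for a fresh save
checkpointer = ocp.PyTreeCheckpointer()
checkpointer.save(path, state)

restored = checkpointer.restore(path)
assert int(restored["step"]) == 100
```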
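
For context on the Flash Attention entry: below is a plain (non-fused) scaled dot-product attention in JAX, the memory-hungry baseline that flash-style kernels reimplement without materializing the full attention matrix. This is a reference sketch, not that repository's code.

```python
import jax
import jax.numpy as jnp

def naive_attention(q, k, v):
    # q, k, v: [seq_len, head_dim]. This materializes the full
    # [seq_len, seq_len] score matrix, which is exactly the O(L^2)
    # memory cost that Flash Attention avoids by tiling.
    scale = q.shape[-1] ** -0.5
    scores = (q @ k.T) * scale
    weights = jax.nn.softmax(scores, axis=-1)
    return weights @ v

q = k = v = jnp.ones((16, 64))
print(naive_attention(q, k, v).shape)  # (16, 64)
```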
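
And a sketch of JMP's policy-based mixed precision, using the policy-string API from its README; the tiny forward function is an assumed stand-in for a real model.

```python
import jax
import jax.numpy as jnp
import jmp

# Keep parameters in full precision, run compute in bfloat16.
policy = jmp.get_policy("params=float32,compute=bfloat16,output=float32")

params = {"w": jnp.ones((8, 8), dtype=jnp.float32)}

def forward(params, x):
    # Cast params and inputs down for compute, cast the result back up.
    params, x = policy.cast_to_compute((params, x))
    y = x @ params["w"]
    return policy.cast_to_output(y)

x = jnp.ones((2, 8), dtype=jnp.float32)
y = jax.jit(forward)(params, x)
print(y.dtype)  # float32 (the matmul itself ran in bfloat16)
```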