google / paxml
Pax is a JAX-based machine learning framework for training large-scale models. Pax allows for advanced, fully configurable experimentation and parallelization, and has demonstrated industry-leading model FLOP utilization rates.
☆513 · Updated last week
Alternatives and similar repositories for paxml
Users interested in paxml are comparing it to the libraries listed below.
- jax-triton contains integrations between JAX and OpenAI Triton (☆405, updated 3 weeks ago)
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax (☆607, updated this week)
- Orbax provides common checkpointing and persistence utilities for JAX users (☆404, updated this week)
- JAX-Toolbox (☆321, updated this week)
- Library for reading and processing ML training data (☆474, updated this week)
- JetStream is a throughput- and memory-optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs wel… (☆354, updated last month)
- JAX implementation of the Llama 2 model (☆219, updated last year)
- JAX Synergistic Memory Inspector (☆175, updated last year)
- seqax = sequence modeling + JAX (☆165, updated last month)
- Implementation of Flash Attention in JAX (☆213, updated last year)
- CLU lets you write beautiful training loops in JAX (☆349, updated 3 weeks ago)
- A stand-alone implementation of several NumPy dtype extensions used in machine learning (☆280, updated this week)
- Train very large language models in JAX (☆204, updated last year)
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… (☆388, updated this week)
- Swarm training framework using Haiku + JAX + Ray for layer-parallel transformer language models on unreliable, heterogeneous nodes (☆239, updated 2 years ago)
- Home for "How To Scale Your Model", a short blog-style textbook about scaling LLMs on TPUs (☆424, updated last week)
- Inference code for LLaMA models in JAX (☆118, updated last year)
- PyTorch Single Controller (☆318, updated this week)
- JMP is a Mixed Precision library for JAX (☆206, updated 5 months ago)
- TorchX is a universal job launcher for PyTorch applications. TorchX is designed to have fast iteration time for training/research and sup… (☆370, updated 2 weeks ago)
- Everything you want to know about Google Cloud TPU (☆532, updated last year)
- For optimization algorithm research and development (☆521, updated this week)