google / paxml
Pax is a JAX-based machine learning framework for training large-scale models. Pax allows for advanced and fully configurable experimentation and parallelization, and has demonstrated industry-leading model FLOP utilization rates.
☆529 · Updated last week
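As a rough illustration of the JAX building blocks that frameworks like Pax compose (this is a minimal sketch, not Pax's actual API): JAX provides function transformations such as `jax.grad` for automatic differentiation and `jax.jit` for XLA compilation, which higher-level training frameworks layer configuration and parallelization on top of.

```python
# Minimal sketch of core JAX primitives (not Pax's API):
# a jit-compiled gradient of a simple linear-model loss.
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Squared-error loss for a linear model.
    pred = x @ w
    return jnp.mean((pred - y) ** 2)

# jax.grad differentiates the loss w.r.t. w; jax.jit compiles it with XLA.
grad_fn = jax.jit(jax.grad(loss))

w = jnp.zeros(3)
x = jnp.ones((4, 3))
y = jnp.ones(4)
g = grad_fn(w, x, y)
print(g.shape)  # → (3,)
```

Frameworks in the list below differ mainly in how they wrap these primitives: experiment configuration, sharding/parallelism strategies, checkpointing, and data loading.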
Alternatives and similar repositories for paxml
Users interested in paxml compare it to the libraries listed below.
- ☆188 · Updated last month
- jax-triton contains integrations between JAX and OpenAI Triton ☆415 · Updated 2 months ago
- ☆328 · Updated this week
- ☆145 · Updated 3 weeks ago
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆648 · Updated this week
- JAX-Toolbox ☆332 · Updated this week
- Orbax provides common checkpointing and persistence utilities for JAX users ☆419 · Updated this week
- ☆527 · Updated last year
- JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs wel… ☆371 · Updated 2 months ago
- ☆361 · Updated last year
- Library for reading and processing ML training data. ☆519 · Updated this week
- seqax = sequence modeling + JAX ☆166 · Updated last month
- ☆275 · Updated last year
- JAX implementation of the Llama 2 model ☆219 · Updated last year
- JAX Synergistic Memory Inspector ☆179 · Updated last year
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… ☆390 · Updated last week
- Fault tolerance for PyTorch (HSDP, LocalSGD, DiLoCo, Streaming DiLoCo) ☆389 · Updated 2 weeks ago
- Implementation of Flash Attention in Jax ☆216 · Updated last year
- PyTorch Single Controller ☆374 · Updated this week
- Train very large language models in Jax. ☆208 · Updated last year
- Inference code for LLaMA models in JAX ☆118 · Updated last year
- A JAX-native LLM Post-Training Library ☆128 · Updated this week
- TorchX is a universal job launcher for PyTorch applications. TorchX is designed to have fast iteration time for training/research and sup… ☆387 · Updated this week
- CLU lets you write beautiful training loops in JAX. ☆355 · Updated 2 months ago
- Accelerate and optimize performance with streamlined training and serving options in JAX. ☆304 · Updated this week
- Minimal yet performant LLM examples in pure JAX ☆150 · Updated last week
- ☆233 · Updated 6 months ago
- Implementation of a Transformer, but completely in Triton ☆273 · Updated 3 years ago
- ☆23 · Updated 2 weeks ago
- A library to analyze PyTorch traces. ☆404 · Updated last week