google / paxml
Pax is a JAX-based machine learning framework for training large-scale models. Pax supports advanced, fully configurable experimentation and parallelization, and has demonstrated industry-leading model FLOP utilization rates.
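Pax builds its configuration and parallelization layers on top of plain JAX transformations. As a rough, hypothetical sketch of the kind of jit-compiled training step such frameworks are built on (none of the names below are Pax APIs; this is illustrative pure JAX only):

```python
import jax
import jax.numpy as jnp

# Illustrative only: a bare-bones JAX training step of the kind
# frameworks like Pax wrap with configuration and sharding machinery.

def init_params(key):
    # A single linear layer: y = x @ w + b
    k_w, _ = jax.random.split(key)
    return {"w": jax.random.normal(k_w, (4, 1)) * 0.1,
            "b": jnp.zeros((1,))}

def loss_fn(params, x, y):
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@jax.jit
def train_step(params, x, y, lr=0.1):
    # Differentiate the loss w.r.t. the parameter pytree and take an SGD step.
    grads = jax.grad(loss_fn)(params, x, y)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = init_params(key)
x = jax.random.normal(key, (32, 4))
y = x @ jnp.ones((4, 1))  # synthetic regression target
for _ in range(100):
    params = train_step(params, x, y)
```

Frameworks in the list below differ mainly in what they layer on top of steps like this: checkpointing (Orbax), data loading (Grain-style libraries), kernels (jax-triton), and serving (JetStream).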
Alternatives and similar repositories for paxml
Users interested in paxml are comparing it to the libraries listed below.
- jax-triton contains integrations between JAX and OpenAI Triton
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax
- JAX-Toolbox
- Orbax provides common checkpointing and persistence utilities for JAX users
- Library for reading and processing ML training data.
- JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs welcome)
- seqax = sequence modeling + JAX
- JAX Synergistic Memory Inspector
- TorchX is a universal job launcher for PyTorch applications. TorchX is designed to have fast iteration time for training/research and sup…
- CLU lets you write beautiful training loops in JAX.
- Implementation of Flash Attention in Jax
- Minimal yet performant LLM examples in pure JAX
- Inference code for LLaMA models in JAX
- Fault tolerance for PyTorch (HSDP, LocalSGD, DiLoCo, Streaming DiLoCo)
- Train very large language models in Jax.
- JAX implementation of the Llama 2 model
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvements
- JMP is a Mixed Precision library for JAX.
- Accelerate and optimize performance with streamlined training and serving options in JAX.
- Task-based datasets, preprocessing, and evaluation for sequence models.