google / torchax
torchax is a PyTorch frontend for JAX. It gives users the ability to author JAX programs using familiar PyTorch syntax. It also provides JAX-PyTorch interoperability, meaning one can mix JAX and PyTorch syntax when authoring ML programs and run them on any hardware JAX supports.
☆171 · Updated this week
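For a sense of what that looks like in practice, here is a minimal sketch based on the `enable_globally()` / `'jax'`-device workflow shown in torchax's README; treat the details as illustrative under that assumption rather than as a definitive API reference:

```python
import torch
import torchax

# Register torchax so torch ops issued on the 'jax' device are
# dispatched to JAX (and thus run anywhere JAX runs: CPU/GPU/TPU).
torchax.enable_globally()

# Author the model in ordinary PyTorch syntax.
model = torch.nn.Sequential(
    torch.nn.Linear(28 * 28, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, 10),
).to('jax')  # parameters become JAX arrays behind the scenes

x = torch.randn(4, 28 * 28, device='jax')
logits = model(x)    # executed by JAX
print(logits.shape)  # torch.Size([4, 10])
```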
Alternatives and similar repositories for torchax
Users interested in torchax are comparing it to the libraries listed below.
- Minimal yet performant LLM examples in pure JAX ☆233 · Updated 2 weeks ago
- A JAX quantization library ☆87 · Updated this week
- Tokamax: A GPU and TPU kernel library. ☆165 · Updated last week
- JAX-Toolbox ☆381 · Updated this week
- A simple library for scaling up JAX programs ☆144 · Updated 2 months ago
- Orbax provides common checkpointing and persistence utilities for JAX users ☆478 · Updated this week
- jax-triton contains integrations between JAX and OpenAI Triton ☆437 · Updated last month
- seqax = sequence modeling + JAX ☆170 · Updated 6 months ago
- 🧱 Modula software package ☆322 · Updated 5 months ago
- Minimal, lightweight JAX implementations of popular models. ☆180 · Updated this week
- ☆289 · Updated last year
- Named Tensors for Legible Deep Learning in JAX ☆217 · Updated 2 months ago
- ☆263 · Updated this week
- A FlashAttention implementation for JAX with support for efficient document mask computation and context parallelism. ☆157 · Updated 2 months ago
- JAX bindings for Flash Attention v2 ☆103 · Updated last month
- OpTree: Optimized PyTree Utilities ☆205 · Updated 3 weeks ago
- Run PyTorch in JAX. 🤝 ☆311 · Updated 3 months ago
- ☆344 · Updated 3 weeks ago
- Library for reading and processing ML training data. ☆670 · Updated last week
- JMP is a Mixed Precision library for JAX. ☆211 · Updated last year
- Small-scale distributed training of sequential deep learning models, built on NumPy and MPI. ☆154 · Updated 2 years ago
- A library for unit scaling in PyTorch ☆133 · Updated 6 months ago
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated 2 months ago
- A user-friendly toolchain that enables seamless execution of ONNX models using JAX as the backend. ☆130 · Updated last month
- PyTorch-like dataloaders for JAX. ☆98 · Updated last month
- Distributed pretraining of large language models (LLMs) on Cloud TPU slices, with JAX and Equinox. ☆24 · Updated last year
- Einsum-like high-level array sharding API for JAX ☆34 · Updated last year
- Lightning-like training API for JAX with Flax ☆45 · Updated last year
- Fault tolerance for PyTorch (HSDP, LocalSGD, DiLoCo, Streaming DiLoCo) ☆472 · Updated 2 weeks ago
- ☆300 · Updated this week