google / torchax
torchax is a PyTorch frontend for JAX. It lets users author JAX programs using familiar PyTorch syntax. It also provides JAX-PyTorch interoperability, meaning one can mix JAX and PyTorch syntax when authoring ML programs and run them on any hardware JAX supports.
☆162 · Updated this week
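The description above implies a drop-in workflow: write the model in PyTorch, execute it through JAX. The sketch below illustrates that idea under the assumption that torchax exposes a global enable switch (`torchax.enable_globally()`); it is a hedged illustration of the stated goal, not a verified reference for the library's API.

```python
# Hedged sketch, not verified against the torchax API:
# `torchax.enable_globally()` is assumed to route PyTorch ops onto JAX
# arrays so the same code runs wherever JAX runs (CPU, GPU, or TPU).
import torch
import torchax

torchax.enable_globally()  # assumed switch: torch ops now dispatch to JAX

# Author the model with familiar PyTorch syntax.
model = torch.nn.Linear(4, 2)
x = torch.randn(8, 4)      # assumed to be backed by a jax.Array once enabled
y = model(x)               # executed through JAX on the available backend
print(y.shape)             # torch.Size([8, 2])
```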
Alternatives and similar repositories for torchax
Users interested in torchax are comparing it to the libraries listed below.
- Minimal yet performant LLM examples in pure JAX ☆225 · Updated last week
- Tokamax: A GPU and TPU kernel library. ☆149 · Updated 2 weeks ago
- A simple library for scaling up JAX programs ☆144 · Updated 2 months ago
- a Jax quantization library ☆83 · Updated this week
- JAX-Toolbox ☆373 · Updated this week
- jax-triton contains integrations between JAX and OpenAI Triton ☆436 · Updated 3 weeks ago
- Orbax provides common checkpointing and persistence utilities for JAX users ☆474 · Updated this week
- seqax = sequence modeling + JAX ☆169 · Updated 5 months ago
- JAX bindings for Flash Attention v2 ☆102 · Updated last week
- Named Tensors for Legible Deep Learning in JAX ☆215 · Updated 2 months ago
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated last month
- Run PyTorch in JAX. 🤝 ☆309 · Updated 2 months ago
- A FlashAttention implementation for JAX with support for efficient document mask computation and context parallelism. ☆155 · Updated last month
- A user-friendly tool chain that enables the seamless execution of ONNX models using JAX as the backend. ☆127 · Updated 3 weeks ago
- OpTree: Optimized PyTree Utilities ☆205 · Updated last week
- ☆341 · Updated this week
- A library for unit scaling in PyTorch ☆133 · Updated 5 months ago
- 🧱 Modula software package ☆322 · Updated 4 months ago
- Minimal, lightweight JAX implementations of popular models. ☆175 · Updated this week
- A stand-alone implementation of several NumPy dtype extensions used in machine learning. ☆323 · Updated this week
- JMP is a Mixed Precision library for JAX. ☆210 · Updated 11 months ago
- ☆261 · Updated 2 weeks ago
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with Jax and Equinox. ☆24 · Updated last year
- JaxPP is a library for JAX that enables flexible MPMD pipeline parallelism for large-scale LLM training ☆61 · Updated 3 weeks ago
- ☆287 · Updated last year
- Small scale distributed training of sequential deep learning models, built on Numpy and MPI. ☆154 · Updated 2 years ago
- Two implementations of ZeRO-1 optimizer sharding in JAX ☆14 · Updated 2 years ago
- Implementation of Flash Attention in Jax ☆223 · Updated last year
- Experiment of using Tangent to autodiff triton ☆81 · Updated last year
- Dion optimizer algorithm ☆413 · Updated this week