google / torchax
torchax is a PyTorch frontend for JAX. It lets you author JAX programs using familiar PyTorch syntax. It also provides JAX–PyTorch interoperability, meaning you can mix JAX and PyTorch code in the same ML program and run it on any hardware JAX supports.
☆128 · Updated last week
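The frontend/backend split described above can be sketched with a toy example: user code calls familiar torch-style operators, while a swappable backend does the actual math. All names here (`Tensor`, `PyBackend`) are hypothetical illustrations, not torchax's API; the real library intercepts PyTorch operations and lowers them to JAX, whereas this sketch dispatches to plain Python for self-containment.

```python
# Toy illustration of a torch-like frontend over a pluggable backend.
# torchax's actual mechanism (intercepting PyTorch ops and lowering them
# to jax.numpy) is far more involved; this only shows the dispatch idea.

class PyBackend:
    """Stand-in backend; a JAX backend would route these to jax.numpy."""
    def add(self, a, b):
        return [x + y for x, y in zip(a, b)]

    def mul(self, a, b):
        return [x * y for x, y in zip(a, b)]

class Tensor:
    """Minimal torch-like tensor facade; ops are delegated to the backend."""
    def __init__(self, data, backend):
        self.data = list(data)
        self.backend = backend

    def __add__(self, other):
        return Tensor(self.backend.add(self.data, other.data), self.backend)

    def __mul__(self, other):
        return Tensor(self.backend.mul(self.data, other.data), self.backend)

backend = PyBackend()
a = Tensor([1.0, 2.0], backend)
b = Tensor([3.0, 4.0], backend)
print((a + b * b).data)  # element-wise: [1+9, 2+16] -> [10.0, 18.0]
```

The point of the design is that the same user-facing syntax works regardless of which backend is plugged in, which is how a PyTorch-syntax program can end up running wherever JAX runs.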
Alternatives and similar repositories for torchax
Users interested in torchax are comparing it to the libraries listed below.
- Minimal yet performant LLM examples in pure JAX ☆202 · Updated 2 months ago
- A simple library for scaling up JAX programs ☆144 · Updated 3 weeks ago
- Named Tensors for Legible Deep Learning in JAX ☆212 · Updated 3 weeks ago
- seqax = sequence modeling + JAX ☆168 · Updated 4 months ago
- Two implementations of ZeRO-1 optimizer sharding in JAX ☆14 · Updated 2 years ago
- Tokamax: A GPU and TPU kernel library. ☆116 · Updated this week
- ☆285 · Updated last year
- A library for unit scaling in PyTorch ☆132 · Updated 4 months ago
- A FlashAttention implementation for JAX with support for efficient document mask computation and context parallelism. ☆149 · Updated 2 weeks ago
- jax-triton contains integrations between JAX and OpenAI Triton ☆436 · Updated this week
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with JAX and Equinox. ☆24 · Updated last year
- JAX-Toolbox ☆364 · Updated this week
- Orbax provides common checkpointing and persistence utilities for JAX users ☆456 · Updated this week
- LoRA for arbitrary JAX models and functions ☆143 · Updated last year
- Einsum-like high-level array sharding API for JAX ☆34 · Updated last year
- JAX bindings for Flash Attention v2 ☆99 · Updated 3 weeks ago
- Run PyTorch in JAX. 🤝 ☆307 · Updated last month
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated 2 months ago
- Experiment of using Tangent to autodiff Triton ☆80 · Updated last year
- JMP is a Mixed Precision library for JAX. ☆211 · Updated 9 months ago
- 🧱 Modula software package ☆307 · Updated 3 months ago
- ☆190 · Updated last week
- Minimal, lightweight JAX implementations of popular models. ☆165 · Updated this week
- A JAX quantization library ☆62 · Updated last week
- Accelerated First Order Parallel Associative Scan ☆192 · Updated last year
- A user-friendly toolchain that enables seamless execution of ONNX models using JAX as the backend. ☆125 · Updated 2 months ago
- If it quacks like a tensor... ☆59 · Updated last year
- ☆243 · Updated last week
- OpTree: Optimized PyTree Utilities ☆203 · Updated this week
- Implementation of Flash Attention in JAX ☆221 · Updated last year