google / torchax
torchax is a PyTorch frontend for JAX. It lets you author JAX programs using familiar PyTorch syntax, and it provides JAX-PyTorch interoperability: you can mix JAX and PyTorch code in the same ML program and run it on any hardware JAX supports.
☆148 · Updated 2 weeks ago
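To make the interoperability claim concrete, here is a minimal usage sketch. It assumes a `torchax.enable_globally()` entry point that redirects PyTorch operations to JAX; the exact API may differ between versions, so treat this as illustrative rather than as the library's documented interface.

```python
# Minimal sketch (assumed API): author a model in PyTorch syntax and run it
# through JAX via torchax. The torchax.enable_globally() entry point is an
# assumption and may differ from the library's current interface.
import torch
import torch.nn as nn
import torchax  # assumed package name

# Route PyTorch tensor ops through JAX so they run on any backend JAX
# supports (CPU, GPU, TPU). Assumed entry point.
torchax.enable_globally()

# A model written with ordinary PyTorch syntax.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

x = torch.randn(8, 16)  # created while torchax is active, so backed by a JAX array
y = model(x)            # the forward pass executes through JAX
print(y.shape)          # torch.Size([8, 4])
```

Once enabled, the tensors flowing through the model are backed by JAX arrays, which is what lets PyTorch-authored code run anywhere JAX runs.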
Alternatives and similar repositories for torchax
Users interested in torchax are comparing it to the libraries listed below:
- Minimal yet performant LLM examples in pure JAX ☆214 · Updated 2 weeks ago
- A simple library for scaling up JAX programs ☆144 · Updated last month
- Tokamax: A GPU and TPU kernel library. ☆133 · Updated this week
- A FlashAttention implementation for JAX with support for efficient document mask computation and context parallelism. ☆152 · Updated last month
- Named Tensors for Legible Deep Learning in JAX ☆215 · Updated last month
- a Jax quantization library ☆74 · Updated last week
- JAX bindings for Flash Attention v2 ☆101 · Updated last month
- seqax = sequence modeling + JAX ☆169 · Updated 4 months ago
- JAX-Toolbox ☆368 · Updated this week
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated 3 weeks ago
- Orbax provides common checkpointing and persistence utilities for JAX users ☆469 · Updated this week
- jax-triton contains integrations between JAX and OpenAI Triton ☆436 · Updated last week
- ☆285 · Updated last year
- Two implementations of ZeRO-1 optimizer sharding in JAX ☆14 · Updated 2 years ago
- LoRA for arbitrary JAX models and functions ☆143 · Updated last year
- Dion optimizer algorithm ☆404 · Updated this week
- 🧱 Modula software package ☆316 · Updated 4 months ago
- Run PyTorch in JAX. 🤝 ☆309 · Updated 2 months ago
- ☆251 · Updated last week
- Minimal, lightweight JAX implementations of popular models. ☆171 · Updated last week
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with Jax and Equinox. ☆24 · Updated last year
- Accelerated First Order Parallel Associative Scan ☆193 · Updated last year
- OpTree: Optimized PyTree Utilities ☆202 · Updated this week
- A library for unit scaling in PyTorch ☆132 · Updated 5 months ago
- Pytorch-like dataloaders for JAX. ☆97 · Updated this week
- ☆122 · Updated 6 months ago
- JMP is a Mixed Precision library for JAX. ☆211 · Updated 10 months ago
- FlashRNN - Fast RNN Kernels with I/O Awareness ☆173 · Updated 2 months ago
- Einsum-like high-level array sharding API for JAX ☆34 · Updated last year
- If it quacks like a tensor... ☆59 · Updated last year