google / torchax
torchax is a PyTorch frontend for JAX. It lets you author JAX programs using familiar PyTorch syntax, and it provides JAX-PyTorch interoperability: you can mix JAX and PyTorch code in the same ML program and run it on any hardware JAX supports.
☆175 · Updated this week
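A minimal sketch of what authoring with torchax might look like, based on the description above. The `torchax.enable_globally()` entry point and the `'jax'` device string are assumptions drawn from the project's README, not a verified example; exact API details may differ.

```python
# Hypothetical torchax usage: PyTorch syntax, JAX execution.
# enable_globally() and the 'jax' device string are assumptions.
import torch
import torchax

torchax.enable_globally()  # dispatch PyTorch ops to JAX from here on

# Author the model with familiar PyTorch syntax...
model = torch.nn.Linear(4, 2).to('jax')
x = torch.randn(8, 4, device='jax')
y = model(x)  # ...and execute on whatever backend JAX targets (CPU/GPU/TPU)
```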
Alternatives and similar repositories for torchax
Users interested in torchax are comparing it to the libraries listed below.
- Minimal yet performant LLM examples in pure JAX ☆236 · Updated 3 weeks ago
- Tokamax: A GPU and TPU kernel library. ☆169 · Updated last week
- A simple library for scaling up JAX programs ☆145 · Updated 3 months ago
- A FlashAttention implementation for JAX with support for efficient document mask computation and context parallelism. ☆157 · Updated 2 months ago
- JAX-Toolbox ☆382 · Updated this week
- Named Tensors for Legible Deep Learning in JAX ☆218 · Updated 3 months ago
- jax-triton contains integrations between JAX and OpenAI Triton ☆437 · Updated last month
- Minimal, lightweight JAX implementations of popular models. ☆187 · Updated this week
- 🧱 Modula software package ☆322 · Updated 5 months ago
- OpTree: Optimized PyTree Utilities ☆205 · Updated this week
- A JAX quantization library ☆90 · Updated this week
- Orbax provides common checkpointing and persistence utilities for JAX users ☆479 · Updated this week
- Dion optimizer algorithm ☆424 · Updated 3 weeks ago
- seqax = sequence modeling + JAX ☆170 · Updated 6 months ago
- ☆289 · Updated last year
- ☆265 · Updated this week
- JMP is a Mixed Precision library for JAX. ☆211 · Updated last year
- A library for unit scaling in PyTorch ☆133 · Updated 6 months ago
- Einsum-like high-level array sharding API for JAX ☆34 · Updated last year
- Run PyTorch in JAX. 🤝 ☆312 · Updated 3 months ago
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated 2 months ago
- Library for reading and processing ML training data. ☆677 · Updated last week
- A Python-embedded DSL that makes it easy to write fast, scalable ML kernels with minimal boilerplate. ☆739 · Updated this week
- JAX bindings for Flash Attention v2 ☆103 · Updated this week
- Small-scale distributed training of sequential deep learning models, built on NumPy and MPI. ☆155 · Updated 2 years ago
- LoRA for arbitrary JAX models and functions ☆144 · Updated last year
- PyTorch-like dataloaders for JAX. ☆98 · Updated last month
- Fault tolerance for PyTorch (HSDP, LocalSGD, DiLoCo, Streaming DiLoCo) ☆475 · Updated this week
- ☆120 · Updated last week
- Implementation of Diffusion Transformer (DiT) in JAX ☆306 · Updated last year