Flax is a neural network library for JAX that is designed for flexibility. (☆7,112, updated Mar 14, 2026)
Alternatives and similar repositories for flax
Users interested in flax are comparing it to the libraries listed below.
- JAX-based neural network library. (☆3,197, updated Mar 12, 2026)
- Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more. (☆35,108, updated this week)
- Optax is a gradient processing and optimization library for JAX. (☆2,212, updated this week)
- Elegant, easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/ (☆2,821, updated Mar 9, 2026)
- JAX: a curated list of resources. https://github.com/google/jax (☆2,074, updated Jan 20, 2026)
- Hardware-accelerated, batchable and differentiable optimizers in JAX. (☆1,030, updated Dec 17, 2025)
- A graph neural network library in JAX. (☆1,465, updated Mar 18, 2024)
- Flexible and powerful tensor operations for readable and reliable code (for PyTorch, JAX, TF and others). (☆9,430, updated Feb 20, 2026)
- Orbax provides common checkpointing and persistence utilities for JAX users. (☆489, updated Mar 14, 2026)
- Probabilistic programming with NumPy powered by JAX for autograd and JIT compilation to GPU/TPU/CPU. (☆2,623, updated this week)
- Development repository for the Triton language and compiler. (☆18,656, updated Mar 14, 2026)
- Trax: deep learning with clear code and speed. (☆8,299, updated Sep 26, 2025)
- functorch is JAX-like composable function transforms for PyTorch. (☆1,437, updated Aug 21, 2025)
- CLU lets you write beautiful training loops in JAX. (☆367, updated Mar 3, 2026)
- Type annotations and runtime checking for shape and dtype of JAX/NumPy/PyTorch/etc. arrays. https://docs.kidger.site/jaxtyping/ (☆1,757, updated Feb 16, 2026)
- Pretrain and finetune any AI model of any size on 1 to 10,000+ GPUs with zero code changes. (☆30,926, updated Mar 10, 2026)
- A high-level API for deep learning in JAX. (☆476, updated Dec 15, 2022)
- Efficiently computes derivatives of NumPy code. (☆7,464, updated this week)
- Library for reading and processing ML training data. (☆694, updated this week)
- A simple, performant and scalable JAX LLM! (☆2,170, updated this week)
- Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/ (☆1,933, updated Feb 23, 2026)
- A machine learning compiler for GPUs, CPUs, and ML accelerators. (☆4,071, updated this week)
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… (☆9,563, updated this week)
- A JAX research toolkit for building, editing, and visualizing neural networks. (☆1,873, updated Jun 22, 2025)
- Massively parallel rigid-body physics simulation on accelerator hardware. (☆3,094, updated this week)
- PyTorch extensions for high-performance and large-scale training. (☆3,403, updated Apr 26, 2025)
- Pax is a JAX-based machine learning framework for training large-scale models. Pax allows for advanced and fully configurable experimenta… (☆551, updated this week)
- PIX is an image processing library in JAX, for JAX. (☆434, updated Mar 6, 2025)
- Fast and easy infinite neural networks in Python. (☆2,377, updated Mar 1, 2024)
- Ray is an AI compute engine: a core distributed runtime plus a set of AI libraries for accelerating ML workloads. (☆41,799, updated this week)
- BlackJAX is a Bayesian inference library designed for ease of use, speed and modularity. (☆1,033, updated Feb 3, 2026)
- Zero-copy MPI communication of JAX arrays, for turbo-charged HPC applications in Python. (☆519, updated Mar 2, 2026)
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. (☆41,807, updated Mar 13, 2026)
- Hackable and optimized Transformer building blocks, supporting composable construction. (☆10,373, updated this week)
- A library of reinforcement learning components and agents. (☆3,946, updated Mar 12, 2026)