google-deepmind / dm_pix
PIX is an image processing library in JAX, for JAX.
☆ 389 · Updated last week
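To give a feel for what PIX provides: its image ops are pure functions over arrays, so they compose with JAX transformations like `jit` and `vmap`. Below is a minimal NumPy sketch of one such op, a left-right flip; the real library exposes an equivalent `dm_pix.flip_left_right` that operates on `jax.numpy` arrays (the NumPy stand-in here is only so the example runs without JAX installed).

```python
import numpy as np


def flip_left_right(image):
    """Mirror an HWC image along its width axis.

    Sketch of a PIX-style functional image op: no mutation, no state,
    just array in, array out. The library's own version has the same
    semantics but is JAX-traceable.
    """
    return image[:, ::-1, :]


# A tiny 2x2 RGB image: height=2, width=2, channels=3.
image = np.arange(12, dtype=np.float32).reshape(2, 2, 3)
flipped = flip_left_right(image)

# Columns swap places: column 0 of the result is column 1 of the input.
assert np.array_equal(flipped[:, 0], image[:, 1])
assert np.array_equal(flipped[:, 1], image[:, 0])
```

Because the op is a pure function, wrapping it in `jax.jit` or mapping it over a batch with `jax.vmap` requires no changes to the call site.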
Related projects
Alternatives and complementary repositories for dm_pix
- CLU lets you write beautiful training loops in JAX. ☆ 321 · Updated this week
- A Pytree Module system for deep learning in JAX ☆ 214 · Updated last year
- Named tensors with first-class dimensions for PyTorch ☆ 322 · Updated last year
- A high-level API for deep learning in JAX ☆ 470 · Updated last year
- Orbax provides common checkpointing and persistence utilities for JAX users ☆ 303 · Updated this week
- JMP is a mixed-precision library for JAX. ☆ 187 · Updated 6 months ago
- Unofficial JAX implementations of deep learning research papers ☆ 151 · Updated 2 years ago
- Mathematical operations for JAX pytrees ☆ 189 · Updated 6 months ago
- Run PyTorch in JAX. 🤝 ☆ 200 · Updated last year
- Oryx is a library for probabilistic programming and deep learning built on top of JAX. ☆ 219 · Updated this week
- Pretrained deep learning models for JAX/Flax: StyleGAN2, GPT-2, VGG, ResNet, etc. ☆ 238 · Updated last year
- A Python package of computer vision models for the Equinox ecosystem. ☆ 102 · Updated 4 months ago
- jax-triton contains integrations between JAX and OpenAI Triton ☆ 343 · Updated 3 weeks ago
- A functional training-loop library for JAX ☆ 85 · Updated 9 months ago
- JAX Synergistic Memory Inspector ☆ 164 · Updated 4 months ago
- Second-order optimization and curvature estimation with K-FAC in JAX. ☆ 249 · Updated this week
- Turn SymPy expressions into trainable JAX expressions. ☆ 322 · Updated 7 months ago
- This library would form a permanent home for reusable components for deep probabilistic programming. The library would form and harness a… ☆ 301 · Updated 3 weeks ago
- Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax). ☆ 104 · Updated 2 years ago
- A library for distributed ML training with PyTorch ☆ 366 · Updated last year
- Hardware-accelerated, batchable, and differentiable optimizers in JAX. ☆ 933 · Updated 2 months ago
- Memory-mapped NumPy arrays of varying shapes ☆ 285 · Updated 5 months ago
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… ☆ 333 · Updated 3 weeks ago
- Swarm training framework using Haiku + JAX + Ray for layer-parallel transformer language models on unreliable, heterogeneous nodes ☆ 237 · Updated last year