willisma / diffuse_nnx
A comprehensive JAX/NNX library for diffusion and flow matching generative algorithms, featuring DiT (Diffusion Transformer) and its variants as the primary backbone with support for ImageNet training and various sampling strategies.
☆117 · Updated last month
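For context, the flow-matching training target that libraries in this space implement can be sketched in a few lines of NumPy. The shapes, distributions, and variable names below are illustrative assumptions for a toy 2-D problem, not diffuse_nnx's actual API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: x1 ~ target distribution, x0 ~ standard Gaussian prior.
x1 = rng.normal(loc=3.0, scale=0.5, size=(4096, 2))  # hypothetical "data"
x0 = rng.normal(size=(4096, 2))                      # prior samples
t = rng.uniform(size=(4096, 1))                      # times in [0, 1]

# Linear interpolant x_t = (1 - t) * x0 + t * x1. The regression target
# for the learned velocity field is the conditional velocity x1 - x0,
# i.e. a model v_theta(x_t, t) is trained to minimize
#   E || v_theta(x_t, t) - (x1 - x0) ||^2.
x_t = (1.0 - t) * x0 + t * x1
v_target = x1 - x0

# Sanity check (no training here): following the target velocity from
# x_t for the remaining time 1 - t lands exactly on x1.
assert np.allclose(x_t + (1.0 - t) * v_target, x1)
```

Sampling then amounts to integrating the learned velocity field from prior samples at t = 0 to t = 1, e.g. with a simple Euler scheme.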
Alternatives and similar repositories for diffuse_nnx
Users interested in diffuse_nnx are comparing it to the libraries listed below.
- Flow-matching algorithms in JAX ☆112 · Updated last year
- ☆121 · Updated 6 months ago
- Official codebase for the paper "How to build a consistency model: Learning flow maps via self-distillation" (NeurIPS 2025) ☆55 · Updated 2 months ago
- Official JAX implementation of MD4 Masked Diffusion Models ☆147 · Updated 9 months ago
- A convenient way to trigger synchronizations to wandb / Weights & Biases if your compute nodes don't have internet! ☆87 · Updated last week
- Reward fine-tuning for Stable Diffusion models based on stochastic optimal control, including Adjoint Matching ☆55 · Updated 6 months ago
- Definitive implementation of the stochastic interpolant framework for generative modeling in JAX ☆38 · Updated 4 months ago
- Use JAX functions in PyTorch ☆258 · Updated 2 years ago
- Maximal Update Parametrization (μP) with Flax & Optax ☆16 · Updated last year
- Implementation of the PSGD optimizer in JAX ☆35 · Updated 11 months ago
- Modern Fixed Point Systems using PyTorch ☆125 · Updated 2 years ago
- ☆28 · Updated 2 months ago
- ☆168 · Updated 2 months ago
- A simple library for scaling up JAX programs ☆144 · Updated last month
- ☆35 · Updated last week
- NF-Layers for constructing neural functionals ☆91 · Updated last year
- A general framework for inference-time scaling and steering of diffusion models with arbitrary rewards ☆199 · Updated 5 months ago
- [ICLR'25] Artificial Kuramoto Oscillatory Neurons ☆105 · Updated last month
- ☆196 · Updated last year
- PyTorch-like dataloaders for JAX ☆98 · Updated 6 months ago
- Run PyTorch in JAX. 🤝 ☆309 · Updated last month
- My take on Flow Matching ☆85 · Updated 11 months ago
- Implementation of Denoising Diffusion Probabilistic Models (DDPM) in JAX and Flax ☆22 · Updated 2 years ago
- Flash Attention Triton kernel with support for second-order derivatives ☆117 · Updated last month
- Lightning-like training API for JAX with Flax ☆44 · Updated last year
- ☆78 · Updated last year
- Code for "Adjoint Sampling: Highly Scalable Diffusion Samplers via Adjoint Matching" ☆128 · Updated 4 months ago
- Code for "Riemannian Flow Matching on General Geometries" ☆275 · Updated last year
- Diffusion models in PyTorch ☆116 · Updated last week
- ☆225 · Updated last year