willisma / diffuse_nnx
A comprehensive JAX/NNX library for diffusion and flow matching generative algorithms, featuring DiT (Diffusion Transformer) and its variants as the primary backbone with support for ImageNet training and various sampling strategies.
☆125 · Updated 2 months ago
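For orientation, the sketch below shows the kind of flow-matching objective such a library trains, written against plain JAX and Flax NNX rather than against diffuse_nnx itself; the `VectorField` module, loss function, and hyperparameters are illustrative placeholders, and a DiT backbone would replace the toy MLP in practice.

```python
# Illustrative only: a generic flow-matching training step in JAX / Flax NNX.
# It does NOT use diffuse_nnx's API; the module, loss, and names below are
# placeholders (a DiT backbone would stand in for the toy MLP).
import jax
import jax.numpy as jnp
from flax import nnx


class VectorField(nnx.Module):
    """Tiny MLP velocity model v_theta(x_t, t); a stand-in for a DiT backbone."""

    def __init__(self, dim: int, hidden: int, *, rngs: nnx.Rngs):
        self.fc1 = nnx.Linear(dim + 1, hidden, rngs=rngs)  # +1 for the time channel
        self.fc2 = nnx.Linear(hidden, dim, rngs=rngs)

    def __call__(self, x, t):
        h = jnp.concatenate([x, t[:, None]], axis=-1)
        return self.fc2(nnx.relu(self.fc1(h)))


def flow_matching_loss(model, x1, key):
    """Linear-interpolant (rectified-flow) objective: regress v(x_t, t) onto x1 - x0."""
    k0, k1 = jax.random.split(key)
    x0 = jax.random.normal(k0, x1.shape)              # noise endpoint
    t = jax.random.uniform(k1, (x1.shape[0],))        # random times in [0, 1)
    xt = (1.0 - t[:, None]) * x0 + t[:, None] * x1    # straight-line interpolant
    return jnp.mean((model(xt, t) - (x1 - x0)) ** 2)


model = VectorField(dim=8, hidden=64, rngs=nnx.Rngs(0))
x1 = jax.random.normal(jax.random.key(1), (32, 8))    # a toy "data" batch

loss, grads = nnx.value_and_grad(flow_matching_loss)(model, x1, jax.random.key(2))

# One manual SGD step; a real setup would use optax (e.g. AdamW) instead.
params = nnx.state(model, nnx.Param)
nnx.update(model, jax.tree.map(lambda p, g: p - 1e-3 * g, params, grads))
```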
Alternatives and similar repositories for diffuse_nnx
Users interested in diffuse_nnx are comparing it to the libraries listed below.
- Flow-matching algorithms in JAX ☆113 · Updated last year
- ☆122 · Updated 6 months ago
- A convenient way to trigger synchronizations to wandb / Weights & Biases if your compute nodes don't have internet! ☆88 · Updated 3 weeks ago
- Official Jax Implementation of MD4 Masked Diffusion Models ☆150 · Updated 10 months ago
- A simple library for scaling up JAX programs ☆144 · Updated 2 months ago
- Implementation of PSGD optimizer in JAX ☆35 · Updated last year
- Use Jax functions in Pytorch ☆259 · Updated 2 years ago
- NF-Layers for constructing neural functionals. ☆93 · Updated 2 years ago
- Modern Fixed Point Systems using Pytorch ☆125 · Updated 2 years ago
- [ICLR'25] Artificial Kuramoto Oscillatory Neurons ☆106 · Updated 2 months ago
- Official codebase for the paper "How to build a consistency model: Learning flow maps via self-distillation" (NeurIPS 2025). ☆63 · Updated 2 months ago
- Definitive implementation of the stochastic interpolant framework for generative modeling in jax. ☆39 · Updated 5 months ago
- My take on Flow Matching ☆89 · Updated 11 months ago
- Reward fine-tuning for Stable Diffusion models based on stochastic optimal control, including Adjoint Matching ☆59 · Updated 7 months ago
- ☆171 · Updated 2 months ago
- Pytorch-like dataloaders for JAX. ☆97 · Updated 2 weeks ago
- ☆82 · Updated last year
- JAX reimplementation of the DeepMind paper "Genie: Generative Interactive Environments" ☆97 · Updated 11 months ago
- A general framework for inference-time scaling and steering of diffusion models with arbitrary rewards. ☆202 · Updated 6 months ago
- Run PyTorch in JAX. 🤝 ☆310 · Updated 2 months ago
- Maximal Update Parametrization (μP) with Flax & Optax. ☆16 · Updated 2 years ago
- ☆27 · Updated 3 months ago
- ☆200 · Updated last year
- WIP ☆93 · Updated last year
- Flash Attention Triton kernel with support for second-order derivatives ☆126 · Updated 2 weeks ago
- ☆34 · Updated 3 months ago
- Implementation of Denoising Diffusion Probabilistic Models (DDPM) in JAX and Flax. ☆22 · Updated 2 years ago
- Implementation of Diffusion Transformer (DiT) in JAX ☆300 · Updated last year
- [ICML 2023] Reflected Diffusion Models (https://arxiv.org/abs/2304.04740) ☆158 · Updated 2 years ago
- ☆36 · Updated this week