willisma / diffuse_nnx
A comprehensive JAX/NNX library for diffusion and flow matching generative algorithms, featuring DiT (Diffusion Transformer) and its variants as the primary backbone, with support for ImageNet training and various sampling strategies.
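To give a sense of the flow-matching objective such a library trains, here is a minimal sketch of the linear-interpolant construction. This is not diffuse_nnx's actual API; the function and variable names are hypothetical, and plain NumPy stands in for the `jax.numpy` a real implementation would use:

```python
import numpy as np

def flow_matching_pair(x0, x1, t):
    """Linear interpolant x_t = (1 - t) * x0 + t * x1 and its
    target velocity d x_t / dt = x1 - x0 (conditional flow matching).

    Hypothetical helper for illustration only, not diffuse_nnx's API.
    """
    t = np.asarray(t).reshape(-1, 1)   # broadcast time over the feature dim
    x_t = (1.0 - t) * x0 + t * x1
    v_target = x1 - x0                 # regression target for the velocity net
    return x_t, v_target

# Toy batch: interpolate from prior noise (x0) toward data samples (x1).
rng = np.random.default_rng(0)
x0 = rng.standard_normal((4, 8))       # samples from the noise prior
x1 = rng.standard_normal((4, 8))       # samples from the data distribution
t = rng.uniform(size=4)                # one time per batch element in [0, 1]

x_t, v = flow_matching_pair(x0, x1, t)
# A velocity network v_theta(x_t, t) would be trained with an MSE loss
# against v; at t=0 the interpolant equals x0, at t=1 it equals x1.
```

Training then amounts to regressing a network's predicted velocity onto `v_target` at random times, and sampling integrates the learned velocity field from noise to data.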
☆128 · Updated 3 months ago
Alternatives and similar repositories for diffuse_nnx
Users interested in diffuse_nnx are comparing it to the libraries listed below.
- A convenient way to trigger synchronizations to wandb / Weights & Biases if your compute nodes don't have internet! ☆88 · Updated 3 weeks ago
- Flow-matching algorithms in JAX ☆114 · Updated last year
- ☆123 · Updated 7 months ago
- Official JAX implementation of MD4 Masked Diffusion Models ☆152 · Updated 11 months ago
- ☆176 · Updated 3 months ago
- Reward fine-tuning for Stable Diffusion models based on stochastic optimal control, including Adjoint Matching ☆61 · Updated 7 months ago
- Flash Attention Triton kernel with support for second-order derivatives ☆133 · Updated last month
- A simple library for scaling up JAX programs ☆144 · Updated 2 months ago
- A general framework for inference-time scaling and steering of diffusion models with arbitrary rewards ☆205 · Updated 7 months ago
- Official codebase for the paper "How to build a consistency model: Learning flow maps via self-distillation" (NeurIPS 2025) ☆70 · Updated 3 months ago
- My take on Flow Matching ☆90 · Updated last year
- Definitive implementation of the stochastic interpolant framework for generative modeling in JAX ☆40 · Updated 6 months ago
- ☆83 · Updated last year
- [ICLR'25] Artificial Kuramoto Oscillatory Neurons ☆106 · Updated 3 months ago
- Implementation of the PSGD optimizer in JAX ☆35 · Updated last year
- Use JAX functions in PyTorch ☆259 · Updated 2 years ago
- JAX reimplementation of the DeepMind paper "Genie: Generative Interactive Environments" ☆98 · Updated last year
- Modern fixed-point systems using PyTorch ☆125 · Updated 2 years ago
- Official code for the paper "Think While You Generate: Discrete Diffusion with Planned Denoising" [ICLR 2025] ☆84 · Updated 9 months ago
- WIP ☆93 · Updated last year
- NF-Layers for constructing neural functionals ☆93 · Updated 2 years ago
- Maximal Update Parametrization (μP) with Flax & Optax ☆16 · Updated 2 years ago
- ☆27 · Updated 3 months ago
- ☆204 · Updated last year
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated 2 months ago
- Run PyTorch in JAX. 🤝 ☆311 · Updated 3 months ago
- Implementation of Denoising Diffusion Probabilistic Models (DDPM) in JAX and Flax ☆22 · Updated 2 years ago
- ☆237 · Updated last year
- Official implementation of the Stochastic Taylor Derivative Estimator (STDE), NeurIPS 2024 ☆127 · Updated last year
- PyTorch-like dataloaders for JAX ☆98 · Updated last month