AshishKumar4 / FlaxDiff
A simple, easy-to-understand library for diffusion models using Flax and JAX. Includes detailed notebooks on DDPM, DDIM, and EDM with simplified mathematical explanations. Made as part of my journey of learning and experimenting with generative AI.
☆40 · Updated 7 months ago
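For a flavor of what the repository's notebooks cover, here is a minimal sketch of the DDPM forward (noising) process in plain JAX. This is an illustration only, not code from FlaxDiff; the schedule parameters and function names are assumptions.

```python
# Minimal DDPM forward-process sketch in plain JAX (illustrative only,
# not FlaxDiff's API): q(x_t | x_0) = N(sqrt(a_bar_t) x_0, (1 - a_bar_t) I).
import jax
import jax.numpy as jnp

def make_ddpm_schedule(timesteps=1000, beta_start=1e-4, beta_end=0.02):
    """Linear beta schedule and cumulative products a_bar_t = prod(1 - beta_s)."""
    betas = jnp.linspace(beta_start, beta_end, timesteps)
    return betas, jnp.cumprod(1.0 - betas)

def q_sample(x0, t, alphas_cumprod, key):
    """Draw x_t ~ q(x_t | x_0) in closed form; also return the noise target."""
    noise = jax.random.normal(key, x0.shape)
    a_bar = alphas_cumprod[t]
    x_t = jnp.sqrt(a_bar) * x0 + jnp.sqrt(1.0 - a_bar) * noise
    return x_t, noise

# Usage: noise a dummy 32x32 RGB image at timestep 500.
x0 = jax.random.normal(jax.random.PRNGKey(0), (32, 32, 3))
_, a_bar = make_ddpm_schedule()
x_t, eps = q_sample(x0, 500, a_bar, jax.random.PRNGKey(1))
```

A denoiser trained to predict `eps` from `x_t` and `t` recovers the standard DDPM training objective.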
Alternatives and similar repositories for FlaxDiff
Users interested in FlaxDiff are comparing it to the libraries listed below.
- Flow-matching algorithms in JAX ☆112 · Updated last year
- Maximal Update Parametrization (μP) with Flax & Optax ☆16 · Updated 2 years ago
- Implementation of Gradient Agreement Filtering (Chaubard et al., Stanford), adapted for single-machine microbatches, in PyTorch ☆25 · Updated 11 months ago
- Implementation of Diffusion Transformers and Rectified Flow in JAX ☆27 · Updated last year
- Explorations into the recently proposed Taylor Series Linear Attention ☆100 · Updated last year
- Implementation of the PSGD optimizer in JAX ☆35 · Updated last year
- Simple implementation of μP, based on the Spectral Condition for Feature Learning. The implementation is SGD-only; don't use it with Adam ☆84 · Updated last year
- Implementing the Denoising Diffusion Probabilistic Model in Flax ☆156 · Updated 3 years ago
- The official repository for the paper "Optimal Flow Matching: Learning Straight Trajectories in Just One Step" (NeurIPS 2024) ☆101 · Updated last year
- Diffusion models in PyTorch ☆120 · Updated last week
- Code release for "Stochastic Optimal Control Matching" ☆39 · Updated last year
- A simple library for scaling up JAX programs ☆144 · Updated last month
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆92 · Updated last year
- My take on Flow Matching ☆86 · Updated 11 months ago
- The Gaussian Histogram Loss (HL-Gauss) proposed by Imani et al., with a few convenient wrappers for regression, in PyTorch ☆70 · Updated last month
- Exploration into the proposed "Self Reasoning Tokens" by Felipe Bonetto ☆57 · Updated last year
- Easy hypernetworks in PyTorch and JAX ☆106 · Updated 2 years ago
- Supporting code for the blog post on modular manifolds ☆108 · Updated 3 months ago
- Scalable and Stable Parallelization of Nonlinear RNNs ☆28 · Updated 2 months ago
- LoRA for arbitrary JAX models and functions ☆143 · Updated last year
- Lightning-like training API for JAX with Flax ☆44 · Updated last year
- Code for the paper "Function-Space Learning Rates" ☆23 · Updated 6 months ago
- Flash Attention Triton kernel with support for second-order derivatives ☆125 · Updated last week