AshishKumar4 / FlaxDiff
A simple, easy-to-understand library for diffusion models using Flax and JAX. Includes detailed notebooks on DDPM, DDIM, and EDM with simplified mathematical explanations. Made as part of my journey of learning and experimenting with generative AI.
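For orientation, the core idea behind the DDPM notebooks is the closed-form forward (noising) process q(x_t | x_0) = N(sqrt(ᾱ_t) x_0, (1 − ᾱ_t) I). Below is a minimal JAX sketch of that single step; the schedule and function names are illustrative only and are not FlaxDiff's actual API.

```python
# Hedged sketch of the DDPM forward (noising) step in plain JAX.
# `linear_beta_schedule` and `q_sample` are illustrative names, not FlaxDiff's API.
import jax
import jax.numpy as jnp

def linear_beta_schedule(timesteps: int, beta_start=1e-4, beta_end=2e-2):
    # Linearly spaced noise variances beta_1 .. beta_T.
    return jnp.linspace(beta_start, beta_end, timesteps)

betas = linear_beta_schedule(1000)
alphas_bar = jnp.cumprod(1.0 - betas)  # cumulative product: \bar{alpha}_t

def q_sample(x0, t, eps):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(abar_t) * x_0, (1 - abar_t) * I)."""
    a_bar = alphas_bar[t]
    return jnp.sqrt(a_bar) * x0 + jnp.sqrt(1.0 - a_bar) * eps

key, eps_key = jax.random.split(jax.random.PRNGKey(0))
x0 = jax.random.normal(key, (8, 32, 32, 3))   # stand-in for a batch of images
eps = jax.random.normal(eps_key, x0.shape)    # Gaussian noise
xt = q_sample(x0, t=500, eps=eps)             # noised sample at timestep 500
```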
☆30 · Updated 4 months ago
Alternatives and similar repositories for FlaxDiff
Users interested in FlaxDiff are comparing it to the libraries listed below.
- ☆116 · Updated 2 months ago
- Maximal Update Parametrization (μP) with Flax & Optax. · ☆16 · Updated last year
- Flow-matching algorithms in JAX · ☆104 · Updated last year
- Implementation of Gradient Agreement Filtering, from Chaubard et al. of Stanford, but for single-machine microbatches, in PyTorch · ☆25 · Updated 7 months ago
- ☆32 · Updated 9 months ago
- Implementation of Diffusion Transformers and Rectified Flow in JAX · ☆25 · Updated last year
- Lightning-like training API for JAX with Flax · ☆42 · Updated 8 months ago
- Simple implementation of muP, based on Spectral Condition for Feature Learning. The implementation is SGD-only; don't use it for Adam. · ☆85 · Updated last year
- A State-Space Model with Rational Transfer Function Representation. · ☆79 · Updated last year
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX · ☆88 · Updated last year
- ☆65 · Updated 9 months ago
- Implementing the Denoising Diffusion Probabilistic Model in Flax · ☆150 · Updated 2 years ago
- A simple library for scaling up JAX programs · ☆143 · Updated 10 months ago
- Diffusion models in PyTorch · ☆107 · Updated 2 months ago
- Implementation of numerous Vision Transformers in Google's JAX and Flax. · ☆22 · Updated 3 years ago
- Implementation of the proposed Adam-atan2 from Google DeepMind, in PyTorch · ☆123 · Updated 9 months ago
- ☆24 · Updated 8 months ago
- ☆57 · Updated 11 months ago
- LoRA for arbitrary JAX models and functions · ☆142 · Updated last year
- Explorations into the recently proposed Taylor Series Linear Attention · ☆100 · Updated last year
- Easy Hypernetworks in PyTorch and JAX · ☆104 · Updated 2 years ago
- ☆34 · Updated last year
- PyTorch-like dataloaders for JAX. · ☆94 · Updated 3 months ago
- Supporting PyTorch FSDP for optimizers · ☆84 · Updated 9 months ago
- Research implementation of Native Sparse Attention (arXiv:2502.11089) · ☆60 · Updated 6 months ago
- FID computation in JAX/Flax. · ☆28 · Updated last year
- Implementation of the PSGD optimizer in JAX · ☆34 · Updated 8 months ago
- The official repository for the paper "Optimal Flow Matching: Learning Straight Trajectories in Just One Step" (NeurIPS 2024) · ☆89 · Updated 8 months ago
- The 2D discrete wavelet transform for JAX · ☆43 · Updated 2 years ago
- Minimal (400 LOC) implementation of maximum (multi-node, FSDP) GPT training · ☆132 · Updated last year