AshishKumar4 / FlaxDiff
A simple, easy-to-understand library for diffusion models using Flax and Jax. Includes detailed notebooks on DDPM, DDIM, and EDM with simplified mathematical explanations. Made as part of my journey of learning and experimenting with generative AI.
☆29 · Updated 3 months ago
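The notebooks cover DDPM, DDIM, and EDM. As a quick orientation, the sketch below shows the closed-form DDPM forward-noising step x_t = sqrt(ᾱ_t)·x_0 + sqrt(1 − ᾱ_t)·ε in plain JAX. This is an illustrative sketch, not FlaxDiff's actual API; the helper names `linear_beta_schedule` and `forward_diffuse` are hypothetical.

```python
import jax
import jax.numpy as jnp

def linear_beta_schedule(timesteps, beta_start=1e-4, beta_end=2e-2):
    # Linearly spaced noise variances beta_1..beta_T (a common DDPM default).
    return jnp.linspace(beta_start, beta_end, timesteps)

def forward_diffuse(x0, t, alpha_bars, key):
    # Sample x_t ~ q(x_t | x_0) in closed form for a batch of integer timesteps t.
    eps = jax.random.normal(key, x0.shape)
    a_bar = alpha_bars[t].reshape((-1,) + (1,) * (x0.ndim - 1))  # broadcast over image dims
    xt = jnp.sqrt(a_bar) * x0 + jnp.sqrt(1.0 - a_bar) * eps
    return xt, eps  # eps is the regression target for a noise-prediction network

# Usage: noise a dummy image batch at random timesteps.
T = 1000
alpha_bars = jnp.cumprod(1.0 - linear_beta_schedule(T))
x0 = jax.random.normal(jax.random.PRNGKey(0), (8, 32, 32, 3))
t = jax.random.randint(jax.random.PRNGKey(1), (8,), 0, T)
xt, eps = forward_diffuse(x0, t, alpha_bars, jax.random.PRNGKey(2))
```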
Alternatives and similar repositories for FlaxDiff
Users interested in FlaxDiff are comparing it to the libraries listed below.
- ☆115 · Updated last month
- ☆31 · Updated 8 months ago
- Flow-matching algorithms in JAX ☆100 · Updated 11 months ago
- ☆65 · Updated 8 months ago
- Maximal Update Parametrization (μP) with Flax & Optax. ☆16 · Updated last year
- Implementation of Gradient Agreement Filtering, from Chaubard et al. of Stanford, but for single-machine microbatches, in Pytorch ☆25 · Updated 6 months ago
- Unofficial but Efficient Implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆85 · Updated last year
- Explorations into the recently proposed Taylor Series Linear Attention ☆100 · Updated 11 months ago
- 📄 Small Batch Size Training for Language Models ☆41 · Updated this week
- The Gaussian Histogram Loss (HL-Gauss) proposed by Imani et al. with a few convenient wrappers for regression, in Pytorch ☆65 · Updated 2 months ago
- Implementation of numerous Vision Transformers in Google's JAX and Flax. ☆22 · Updated 2 years ago
- A State-Space Model with Rational Transfer Function Representation. ☆79 · Updated last year
- The official repository for the paper "Optimal Flow Matching: Learning Straight Trajectories in Just One Step" (NeurIPS 2024) ☆84 · Updated 7 months ago
- Implementation of Diffusion Transformers and Rectified Flow in Jax ☆25 · Updated last year
- ☆206 · Updated 8 months ago
- LoRA for arbitrary JAX models and functions ☆140 · Updated last year
- Exploration into the Scaling Value Iteration Networks paper, from Schmidhuber's group ☆36 · Updated 10 months ago
- Implementation of a multimodal diffusion transformer in Pytorch ☆102 · Updated last year
- Implementing the Denoising Diffusion Probabilistic Model in Flax ☆150 · Updated 2 years ago
- Lightning-like training API for JAX with Flax ☆42 · Updated 8 months ago
- ☆53 · Updated 10 months ago
- ☆33 · Updated last year
- Minimal (400 LOC) implementation of maximum (multi-node, FSDP) GPT training ☆130 · Updated last year
- Implementation of GateLoop Transformer in Pytorch and Jax ☆89 · Updated last year
- Diffusion models in PyTorch ☆107 · Updated last month
- Simple implementation of muP, based on Spectral Condition for Feature Learning. The implementation is SGD-only; don't use it for Adam ☆84 · Updated last year
- Tiny re-implementation of MDM in the style of LLaDA and the nano-gpt speedrun ☆55 · Updated 4 months ago
- Implementation of the proposed Adam-atan2 from Google Deepmind in Pytorch ☆115 · Updated 8 months ago
- Easy Hypernetworks in Pytorch and Jax ☆103 · Updated 2 years ago
- A repo where I play with conditional flow approaches for learning time-varying vector fields. ☆21 · Updated last year