JesseFarebro / flax-mup
Maximal Update Parametrization (μP) with Flax & Optax.
☆16 · Updated 2 years ago
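To make the μP idea concrete: under the Maximal Update Parametrization, hidden-layer weights are initialized with variance 1/fan_in and (with Adam) their learning rates shrink like 1/width relative to a tuned base width, so activation and update scales stay O(1) as the model widens. Below is a minimal NumPy sketch of these two rules; the function names are illustrative, not this repository's API:

```python
import numpy as np

def mup_init(fan_in, fan_out, rng):
    # μP hidden-layer rule: weight variance 1/fan_in, so
    # pre-activations stay O(1) regardless of width.
    return rng.normal(0.0, fan_in ** -0.5, size=(fan_in, fan_out))

def mup_adam_lr(base_lr, base_width, width):
    # Hidden-layer Adam learning rate scales as 1/width
    # relative to the base width it was tuned at.
    return base_lr * base_width / width

rng = np.random.default_rng(0)
for width in (128, 512, 2048):
    x = rng.normal(size=(width,))        # O(1) input activations
    h = x @ mup_init(width, width, rng)  # pre-activations
    # h.std() stays near 1 at every width, while the LR shrinks.
    print(width, float(h.std()), mup_adam_lr(3e-4, 128, width))
```

The practical payoff, which this repository targets for Flax/Optax models, is that hyperparameters tuned at a small base width transfer to larger widths without retuning.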
Alternatives and similar repositories for flax-mup
Users interested in flax-mup are comparing it to the libraries listed below.
- A simple library for scaling up JAX programs · ☆144 · Updated 3 months ago
- LoRA for arbitrary JAX models and functions · ☆144 · Updated last year
- Minimal but scalable implementation of large language models in JAX · ☆35 · Updated 2 months ago
- Lightning-like training API for JAX with Flax · ☆45 · Updated last year
- ☆123 · Updated 7 months ago
- PyTorch-like dataloaders for JAX · ☆98 · Updated last month
- Supporting PyTorch FSDP for optimizers · ☆84 · Updated last year
- Implementation of the PSGD optimizer in JAX · ☆35 · Updated last year
- ☆18 · Updated last year
- Flow-matching algorithms in JAX · ☆114 · Updated last year
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX · ☆92 · Updated 2 years ago
- Minimal yet performant LLM examples in pure JAX · ☆236 · Updated 3 weeks ago
- ☆35 · Updated last year
- ☆62 · Updated last year
- JAX/Flax rewrite of Karpathy's nanoGPT · ☆63 · Updated 2 years ago
- Einsum-like high-level array sharding API for JAX · ☆34 · Updated last year
- 🧱 Modula software package · ☆322 · Updated 5 months ago
- If it quacks like a tensor... · ☆59 · Updated last year
- nanoGPT using Equinox · ☆15 · Updated 2 years ago
- ☆289 · Updated last year
- ☆70 · Updated last year
- A repo based on XiLin Li's PSGD repo that extends some of the experiments · ☆14 · Updated last year
- Accelerated First Order Parallel Associative Scan · ☆196 · Updated 3 weeks ago
- Minimal, lightweight JAX implementations of popular models · ☆180 · Updated last week
- ☆238 · Updated last year
- Implementation of Gradient Agreement Filtering, from Chaubard et al. of Stanford, but for single-machine microbatches, in PyTorch · ☆25 · Updated last year
- A simple, easy-to-understand library for diffusion models using Flax and JAX. Includes detailed notebooks on DDPM, DDIM, and EDM with sim… · ☆40 · Updated 8 months ago
- A flexible and efficient implementation of Flash Attention 2.0 for JAX, supporting multiple backends (GPU/TPU/CPU) and platforms (Triton/… · ☆34 · Updated 11 months ago
- JAX implementation of the Mistral 7B v0.2 model · ☆35 · Updated last year
- ☆92 · Updated last year