JesseFarebro / flax-mup
Maximal Update Parametrization (μP) with Flax & Optax.
☆16 · Updated last year
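For context, μP reparametrizes a network so that hyperparameters tuned at a small width transfer to larger widths; in practice this largely means scaling hidden-layer learning rates by 1/width. Below is a minimal, hypothetical sketch of that idea using Optax's `multi_transform`, not flax-mup's actual API; the widths, labels, and learning rate are illustrative assumptions, and full μP also treats input/output layers and initializers specially.

```python
# Illustrative μP-style per-parameter learning-rate scaling with Optax.
# NOTE: a simplified sketch, not the flax-mup API.
import jax
import optax

BASE_WIDTH = 128   # width at which the learning rate was tuned (assumption)
WIDTH = 1024       # width of the model actually being trained (assumption)
BASE_LR = 1e-3     # illustrative base learning rate

def label(path, _leaf):
    # Flax stores dense/conv weight matrices under the key "kernel";
    # biases and other leaves keep the base learning rate here.
    return "hidden" if path[-1].key == "kernel" else "base"

tx = optax.multi_transform(
    {
        # μP: hidden-matrix updates shrink as 1/width relative to the base model.
        "hidden": optax.adam(BASE_LR * BASE_WIDTH / WIDTH),
        "base": optax.adam(BASE_LR),
    },
    param_labels=lambda params: jax.tree_util.tree_map_with_path(label, params),
)
```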
Alternatives and similar repositories for flax-mup
Users interested in flax-mup are comparing it to the libraries listed below:
- A simple library for scaling up JAX programs ☆143 · Updated 11 months ago
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated last month
- Implementation of the PSGD optimizer in JAX ☆33 · Updated 9 months ago
- LoRA for arbitrary JAX models and functions ☆142 · Updated last year
- PyTorch-like dataloaders for JAX ☆94 · Updated 4 months ago
- ☆120 · Updated 3 months ago
- Minimal yet performant LLM examples in pure JAX ☆177 · Updated last week
- Flow-matching algorithms in JAX ☆105 · Updated last year
- Minimal, lightweight JAX implementations of popular models ☆109 · Updated this week
- Lightning-like training API for JAX with Flax ☆42 · Updated 9 months ago
- JAX/Flax rewrite of Karpathy's nanoGPT ☆60 · Updated 2 years ago
- ☆33 · Updated 10 months ago
- Supporting PyTorch FSDP for optimizers ☆84 · Updated 9 months ago
- Einsum-like high-level array sharding API for JAX ☆35 · Updated last year
- ☆281 · Updated last year
- The simplest, fastest repository for training/finetuning medium-sized GPTs ☆36 · Updated last year
- Supporting code for the blog post on modular manifolds ☆39 · Updated last week
- 🧱 Modula software package ☆277 · Updated last month
- nanoGPT using Equinox ☆13 · Updated 2 years ago
- ☆17 · Updated last year
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆89 · Updated last year
- If it quacks like a tensor... ☆59 · Updated 10 months ago
- A port of the Mistral-7B model to JAX ☆32 · Updated last year
- Implementation of Denoising Diffusion Probabilistic Models (DDPM) in JAX and Flax ☆20 · Updated last year
- ☆58 · Updated last year
- JAX Arrays for human consumption ☆106 · Updated 3 months ago
- ☆67 · Updated 10 months ago
- Scalable and stable parallelization of nonlinear RNNs ☆22 · Updated last month
- A functional training-loops library for JAX ☆88 · Updated last year
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with JAX and Equinox ☆24 · Updated last year