evanatyourservice / psgd_jax
Implementation of the PSGD (preconditioned stochastic gradient descent) optimizer in JAX
☆33 · Updated 5 months ago
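For orientation, here is a minimal sketch of how an optax-style PSGD optimizer could slot into a JAX training step. This assumes psgd_jax exposes optax-compatible transformations; the `psgd_jax.kron` constructor name and its arguments are illustrative placeholders, not verified API (check the repository README for the actual exports and for variants that require Hessian-vector products).

```python
import jax
import jax.numpy as jnp
import optax
import psgd_jax  # assumed import; see the repo README for actual exports

def loss_fn(params, x, y):
    # Simple least-squares loss on a linear model.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

params = {"w": jnp.zeros((8, 1)), "b": jnp.zeros((1,))}

# Hypothetical constructor name: in practice you would pick a PSGD
# variant (Kron, XMat, affine, low-rank, ...) here.
optimizer = psgd_jax.kron(learning_rate=3e-4)
opt_state = optimizer.init(params)

@jax.jit
def train_step(params, opt_state, x, y):
    loss, grads = jax.value_and_grad(loss_fn)(params, x, y)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state, loss
```

As with any optax `GradientTransformation`, the preconditioner state lives in `opt_state` and is threaded through each jitted step.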
Alternatives and similar repositories for psgd_jax
Users interested in psgd_jax are comparing it to the libraries listed below:
- A simple library for scaling up JAX programs · ☆137 · Updated 7 months ago
- Turn jitted JAX functions back into Python source code · ☆22 · Updated 5 months ago
- LoRA for arbitrary JAX models and functions · ☆135 · Updated last year
- Minimal but scalable implementation of large language models in JAX · ☆34 · Updated 7 months ago
- ☆17 · Updated 9 months ago
- PyTorch-like dataloaders for JAX · ☆83 · Updated last week
- Maximal Update Parametrization (μP) with Flax & Optax · ☆11 · Updated last year
- Supporting PyTorch FSDP for optimizers · ☆79 · Updated 5 months ago
- Flow-matching algorithms in JAX · ☆91 · Updated 9 months ago
- ☆116 · Updated last week
- 🧱 Modula software package · ☆194 · Updated 2 months ago
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX · ☆81 · Updated last year
- If it quacks like a tensor... · ☆58 · Updated 6 months ago
- JAX/Flax rewrite of Karpathy's nanoGPT · ☆57 · Updated 2 years ago
- Einsum-like high-level array sharding API for JAX · ☆34 · Updated 10 months ago
- JAX Arrays for human consumption · ☆91 · Updated last year
- Scalable and stable parallelization of nonlinear RNNs · ☆15 · Updated 4 months ago
- Lightning-like training API for JAX with Flax · ☆38 · Updated 5 months ago
- seqax = sequence modeling + JAX · ☆155 · Updated last month
- Run PyTorch in JAX 🤝 · ☆245 · Updated 3 months ago
- PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioner, low-rank approximation preconditioner, …) · ☆175 · Updated last week
- Automatically take good care of your preemptible TPUs · ☆36 · Updated 2 years ago
- ☆267 · Updated 10 months ago
- ☆185 · Updated 6 months ago
- ☆39 · Updated last year
- ☆53 · Updated last year
- ☆28 · Updated 6 months ago
- A port of muP to JAX/Haiku · ☆25 · Updated 2 years ago
- JAX implementation of VQVAE/VQGAN autoencoders (+FSQ) · ☆29 · Updated 11 months ago
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with JAX and Equinox · ☆24 · Updated 8 months ago