cgarciae/nanoGPT-jax
The simplest, fastest repository for training/finetuning medium-sized GPTs.
☆32 · Updated last year
Alternatives and similar repositories for nanoGPT-jax:
Users interested in nanoGPT-jax are comparing it to the libraries listed below.
- A functional training loops library for JAX ☆86 · Updated last year
- JAX implementations of core Deep RL algorithms ☆79 · Updated 2 years ago
- A collection of meta-learning algorithms in JAX ☆22 · Updated 2 years ago
- A metrics library for the JAX ecosystem ☆40 · Updated last year
- Turn jitted JAX functions back into Python source code ☆22 · Updated 2 months ago
- LoRA for arbitrary JAX models and functions ☆135 · Updated last year
- Accelerated replay buffers in JAX ☆41 · Updated 2 years ago
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆83 · Updated last year
- Lightning-like training API for JAX with Flax ☆38 · Updated 3 months ago
- A port of muP to JAX/Haiku ☆25 · Updated 2 years ago
- A port of the Mistral-7B model to JAX ☆32 · Updated 8 months ago
- Neural Networks for JAX ☆83 · Updated 5 months ago
- A small library for creating and manipulating custom JAX Pytree classes