google-deepmind / nanodo
☆214 · Updated 8 months ago
Alternatives and similar repositories for nanodo: users interested in nanodo are comparing it to the libraries listed below.
- seqax = sequence modeling + JAX (☆148, updated this week)
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax (☆555, updated this week)
- A simple library for scaling up JAX programs (☆134, updated 4 months ago)
- 🧱 Modula software package (☆172, updated last week)
- LoRA for arbitrary JAX models and functions (☆135, updated last year)
- Named Tensors for Legible Deep Learning in JAX (☆167, updated this week)
- JAX Synergistic Memory Inspector (☆170, updated 8 months ago)
- Cost-aware hyperparameter tuning algorithm (☆147, updated 8 months ago)
- ☆138 (updated this week)
- jax-triton contains integrations between JAX and OpenAI Triton (☆384, updated this week)
- A MAD laboratory to improve AI architecture designs 🧪 (☆107, updated 3 months ago)
- Run PyTorch in JAX. 🤝 (☆231, updated last month)
- Inference code for LLaMA models in JAX (☆116, updated 10 months ago)
- Jax/Flax rewrite of Karpathy's nanoGPT (☆57, updated 2 years ago)
- ☆220 (updated last month)
- Efficient optimizers (☆183, updated last week)
- Orbax provides common checkpointing and persistence utilities for JAX users (☆348, updated this week)
- JAX implementation of the Llama 2 model (☆216, updated last year)
- The simplest, fastest repository for training/finetuning medium-sized GPTs. (☆100, updated 4 months ago)
- CLU lets you write beautiful training loops in JAX. (☆335, updated 2 weeks ago)
- Understand and test language model architectures on synthetic tasks. (☆184, updated 2 weeks ago)
- Minimal (400 LOC) implementation of maximum (multi-node, FSDP) GPT training (☆122, updated 11 months ago)
- JMP is a Mixed Precision library for JAX. (☆193, updated last month)
- Accelerated First Order Parallel Associative Scan (☆175, updated 7 months ago)
- Minimal but scalable implementation of large language models in JAX (☆34, updated 4 months ago)
- Supporting PyTorch FSDP for optimizers (☆79, updated 3 months ago)
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with Jax and Equinox. (☆24, updated 5 months ago)
- ☆76 (updated 8 months ago)
- Library for reading and processing ML training data. (☆407, updated this week)