google-research / kauldron
Modular, scalable library to train ML models
☆203 · Updated this week
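As context for the comparisons below, here is a minimal sketch of the kind of modular JAX train step that a training library like kauldron wraps with configuration and logging machinery. This is illustrative only, not kauldron's actual API; the model, function names, and hyperparameters are all hypothetical.

```python
# Illustrative sketch (NOT kauldron's API): a bare-bones modular JAX
# train step of the sort such training libraries build on.
import jax
import jax.numpy as jnp


def init_params(key):
    # Hypothetical toy model: a single linear layer y = x @ w + b.
    k1, _ = jax.random.split(key)
    return {"w": jax.random.normal(k1, (3, 1)) * 0.1,
            "b": jnp.zeros((1,))}


def loss_fn(params, x, y):
    # Mean squared error on the linear model's predictions.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)


@jax.jit
def train_step(params, x, y, lr=0.1):
    # One SGD step: compute loss and gradients, then update every
    # leaf of the parameter pytree in place of an optimizer library.
    loss, grads = jax.value_and_grad(loss_fn)(params, x, y)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params, loss
```

Libraries in this space differ mainly in how much of the surrounding machinery (configs, checkpointing, sharding, metrics) they standardize around a loop like this.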
Alternatives and similar repositories for kauldron
Users interested in kauldron are comparing it to the libraries listed below.
- A JAX quantization library ☆87 · Updated this week
- For optimization algorithm research and development. ☆558 · Updated 2 weeks ago
- A JAX-based library for building transformers; includes implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT, and more. ☆298 · Updated last year
- ☆192 · Updated 2 weeks ago
- An implementation of the PSGD Kron second-order optimizer for PyTorch ☆98 · Updated 6 months ago
- ☆300 · Updated last week
- ☆289 · Updated last year
- Scalable and Performant Data Loading ☆362 · Updated last week
- Minimal yet performant LLM examples in pure JAX ☆233 · Updated 2 weeks ago
- JAX-Toolbox ☆381 · Updated this week
- 🧱 Modula software package ☆322 · Updated 5 months ago
- ☆314 · Updated last year
- JAX implementation of Black Forest Labs' Flux.1 family of models ☆40 · Updated 2 months ago
- A FlashAttention implementation for JAX with support for efficient document mask computation and context parallelism. ☆157 · Updated 2 months ago
- Library for reading and processing ML training data. ☆670 · Updated last week
- Accelerate and optimize performance with streamlined training and serving options in JAX. ☆333 · Updated 3 weeks ago
- ☆120 · Updated this week
- Dion optimizer algorithm ☆420 · Updated 2 weeks ago
- ☆214 · Updated this week
- Google TPU optimizations for transformer models ☆133 · Updated last week
- A simple library for scaling up JAX programs ☆144 · Updated 2 months ago
- ☆162 · Updated 3 months ago
- ☆82 · Updated last year
- torchax is a PyTorch frontend for JAX: it lets you author JAX programs using familiar PyTorch syntax. It also provides JA… ☆171 · Updated this week
- ☆152 · Updated 4 months ago
- Unofficial JAX implementations of deep learning research papers ☆161 · Updated 3 years ago
- Minimal (400 LOC) implementation, maximum (multi-node, FSDP) GPT training ☆132 · Updated last year
- Implementation of Diffusion Transformer (DiT) in JAX ☆305 · Updated last year
- Pax is a JAX-based machine learning framework for training large-scale models. Pax allows for advanced and fully configurable experimenta… ☆548 · Updated 2 weeks ago
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆694 · Updated this week