imbue-ai / carbs
Cost-aware hyperparameter tuning algorithm
☆168 · Updated last year
Alternatives and similar repositories for carbs
Users interested in carbs are comparing it to the libraries listed below.
- ☆281 · Updated last year
- seqax = sequence modeling + JAX ☆167 · Updated 2 months ago
- 🧱 Modula software package ☆277 · Updated last month
- Minimal yet performant LLM examples in pure JAX ☆177 · Updated last week
- Efficient baselines for autocurricula in JAX. ☆197 · Updated last year
- A simple library for scaling up JAX programs ☆143 · Updated 11 months ago
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated last month
- Latent Program Network (from the "Searching Latent Program Spaces" paper) ☆98 · Updated 6 months ago
- A set of Python scripts that make your experience on TPU better ☆54 · Updated 2 weeks ago
- Accelerate and optimize performance with streamlined training and serving options in JAX. ☆311 · Updated this week
- ☆111 · Updated this week
- ☆120 · Updated 3 months ago
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆164 · Updated 3 months ago
- Learn online intrinsic rewards from LLM feedback ☆43 · Updated 9 months ago
- Efficient optimizers ☆265 · Updated last week
- ☆187 · Updated last month
- (Crafter + NetHack) in JAX. ICML 2024 Spotlight. ☆336 · Updated 2 months ago
- Minimal (400 LOC) implementation, maximum (multi-node, FSDP) GPT training ☆132 · Updated last year
- fast + parallel AlphaZero in JAX ☆101 · Updated 9 months ago
- A simple, performant and scalable JAX-based world modeling codebase ☆75 · Updated this week
- Solve puzzles. Learn CUDA. ☆63 · Updated last year
- Synchronized Curriculum Learning for RL Agents ☆113 · Updated last month
- Supporting PyTorch FSDP for optimizers ☆84 · Updated 9 months ago
- Understand and test language model architectures on synthetic tasks. ☆229 · Updated last week
- LeanRL is a fork of CleanRL in which selected PyTorch scripts are optimized for performance using torch.compile and CUDA graphs. ☆634 · Updated last month
- Official JAX implementation of xLSTM, including fast and efficient training and inference code. 7B model available at https://huggingface.… ☆103 · Updated 8 months ago
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆667 · Updated this week
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with Jax and Equinox. ☆24 · Updated last year
- Normalized Transformer (nGPT) ☆191 · Updated 10 months ago
- Jax/Flax rewrite of Karpathy's nanoGPT ☆60 · Updated 2 years ago