HazyResearch / train-tk
train with kittens!
☆49 · Updated 2 weeks ago
Related projects
Alternatives and complementary repositories for train-tk
- Simple and efficient pytorch-native transformer training and inference (batched) ☆61 · Updated 7 months ago
- Experiment of using Tangent to autodiff triton ☆72 · Updated 9 months ago
- ☆50 · Updated last month
- Tree Attention: Topology-aware Decoding for Long-Context Attention on GPU clusters ☆104 · Updated last month
- Experiments for efforts to train a new and improved t5 ☆76 · Updated 6 months ago
- ☆76 · Updated 5 months ago
- A MAD laboratory to improve AI architecture designs 🧪 ☆95 · Updated 6 months ago
- ☆72 · Updated 4 months ago
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆84 · Updated last week
- ☆53 · Updated 9 months ago
- ☆39 · Updated 9 months ago
- seqax = sequence modeling + JAX ☆132 · Updated 3 months ago
- Make triton easier ☆41 · Updated 5 months ago
- ☆50 · Updated 5 months ago
- ☆49 · Updated 7 months ago
- ☆27 · Updated 4 months ago
- GoldFinch and other hybrid transformer components ☆39 · Updated 3 months ago
- Token Omission Via Attention ☆119 · Updated 3 weeks ago
- ☆36 · Updated 3 months ago
- Collection of autoregressive model implementations ☆66 · Updated last week
- Scalable neural net training via automatic normalization in the modular norm. ☆119 · Updated 2 months ago
- Using FlexAttention to compute attention with different masking patterns ☆40 · Updated last month
- Cold Compress is a hackable, lightweight, and open-source toolkit for creating and benchmarking cache compression methods built on top of… ☆86 · Updated 3 months ago
- Code for exploring Based models from "Simple linear attention language models balance the recall-throughput tradeoff" ☆213 · Updated 2 months ago
- ☆55 · Updated 11 months ago
- ☆61 · Updated 2 months ago
- A place to store reusable transformer components of my own creation or found on the interwebs ☆43 · Updated last week
- gzip Predicts Data-dependent Scaling Laws ☆32 · Updated 5 months ago
- ☆20 · Updated last year
- Muon optimizer for neural networks: >30% extra sample efficiency, <3% wallclock overhead ☆69 · Updated this week