leloykun / modded-nanogpt
NanoGPT (124M) quality in 2.67B tokens
☆24 · Updated this week
Alternatives and similar repositories for modded-nanogpt:
Users interested in modded-nanogpt are comparing it to the repositories listed below
- Collection of autoregressive model implementations ☆76 · Updated this week
- Train a SmolLM-style LLM on fineweb-edu in JAX/Flax with an assortment of optimizers ☆14 · Updated last week
- ☆46 · Updated 2 months ago
- ☆69 · Updated 4 months ago
- Train, tune, and infer the Bamba model ☆73 · Updated this week
- A repository for research on medium-sized language models ☆76 · Updated 7 months ago
- ☆40 · Updated 11 months ago
- ☆37 · Updated 5 months ago
- DPO, but faster 🚀 ☆29 · Updated last month
- Tree Attention: Topology-aware Decoding for Long-Context Attention on GPU clusters ☆109 · Updated last month
- RWKV-7: Surpassing GPT ☆68 · Updated last month
- [WIP] Transformer to embed Danbooru labelsets ☆13 · Updated 9 months ago
- ☆49 · Updated 9 months ago
- My fork of Allen AI's OLMo for educational purposes ☆30 · Updated last month
- Official repository for the paper "Approximating Two-Layer Feedforward Networks for Efficient Transformers" ☆36 · Updated last year
- GoldFinch and other hybrid transformer components ☆42 · Updated 5 months ago
- ☆48 · Updated 4 months ago
- Small, simple agent task environments for training and evaluation ☆18 · Updated 2 months ago
- ☆26 · Updated 9 months ago
- ☆78 · Updated 8 months ago
- Maya: An Instruction Finetuned Multilingual Multimodal Model using Aya ☆98 · Updated this week
- Using multiple LLMs for ensemble forecasting ☆16 · Updated 11 months ago
- Make Triton easier ☆42 · Updated 6 months ago
- Data preparation code for the CrystalCoder 7B LLM ☆43 · Updated 8 months ago
- Anchored Preference Optimization and Contrastive Revisions: Addressing Underspecification in Alignment ☆53 · Updated 4 months ago
- ☆75 · Updated 6 months ago
- ☆62 · Updated 3 months ago
- ☆47 · Updated 4 months ago
- PyTorch implementation of the PEER block from the paper "Mixture of A Million Experts" by Xu Owen He at DeepMind ☆115 · Updated 4 months ago
- ☆31 · Updated 6 months ago