LucasPrietoAl / grokking-at-the-edge-of-numerical-stability
☆84 · Updated last month
Alternatives and similar repositories for grokking-at-the-edge-of-numerical-stability:
Users interested in grokking-at-the-edge-of-numerical-stability often compare it to the repositories listed below.
- Tree Attention: Topology-aware Decoding for Long-Context Attention on GPU clusters ☆116 · Updated 2 months ago
- ☆53 · Updated last year
- ☆78 · Updated 10 months ago
- Token Omission Via Attention ☆123 · Updated 4 months ago
- Supporting PyTorch FSDP for optimizers ☆76 · Updated 2 months ago
- A MAD laboratory to improve AI architecture designs 🧪 ☆103 · Updated 2 months ago
- ☆71 · Updated 6 months ago
- Explorations into the proposal from the paper "Grokfast: Accelerated Grokking by Amplifying Slow Gradients" ☆95 · Updated last month
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆95 · Updated 3 months ago
- Understand and test language model architectures on synthetic tasks. ☆181 · Updated last month
- PyTorch implementation of models from the Zamba2 series. ☆176 · Updated 3 weeks ago
- Mixture of A Million Experts ☆40 · Updated 6 months ago
- Collection of autoregressive model implementations ☆81 · Updated last week
- A byte-level decoder architecture that matches the performance of tokenized Transformers. ☆65 · Updated 9 months ago
- An easy-to-understand framework for LLM samplers that rewind and revise generated tokens ☆130 · Updated this week
- ☆75 · Updated 7 months ago
- Normalized Transformer (nGPT) ☆152 · Updated 3 months ago
- Memory Mosaics are networks of associative memories working in concert to achieve a prediction task. ☆39 · Updated 3 weeks ago
- ☆49 · Updated 11 months ago
- PyTorch implementation of the PEER block from the paper "Mixture of A Million Experts" by Xu Owen He at DeepMind ☆118 · Updated 5 months ago
- DeMo: Decoupled Momentum Optimization ☆181 · Updated 2 months ago
- ☆79 · Updated 3 months ago
- ☆51 · Updated 9 months ago
- This repo is based on https://github.com/jiaweizzhao/GaLore ☆24 · Updated 5 months ago
- ☆47 · Updated 5 months ago
- ☆44 · Updated 3 months ago
- GoldFinch and other hybrid transformer components ☆43 · Updated 7 months ago
- The simplest, fastest repository for training/finetuning medium-sized xLSTMs. ☆39 · Updated 8 months ago