cloneofsimo / ezmup
Simple implementation of muP, based on "A Spectral Condition for Feature Learning". The implementation is SGD-only; don't use it with Adam.
☆85 · Updated last year
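As a rough sketch of the idea behind ezmup: the spectral condition asks each weight matrix (and each update to it) to have spectral norm on the order of sqrt(fan_out / fan_in), which for SGD works out to a per-layer learning-rate multiplier of fan_out / fan_in. The function names below are hypothetical, not ezmup's API, and the scalings are paraphrased from the paper; this is why the trick does not transfer to Adam, whose updates are normalized differently.

```python
import math

def spectral_init_std(fan_in: int, fan_out: int) -> float:
    # Target spectral norm Theta(sqrt(fan_out / fan_in)). A Gaussian matrix
    # with entrywise std sigma has spectral norm roughly
    # sigma * (sqrt(fan_in) + sqrt(fan_out)), so solve for sigma.
    return math.sqrt(fan_out / fan_in) / (math.sqrt(fan_in) + math.sqrt(fan_out))

def spectral_sgd_lr(base_lr: float, fan_in: int, fan_out: int) -> float:
    # Per-layer SGD learning rate scales as fan_out / fan_in under the
    # spectral condition (this multiplier is SGD-specific).
    return base_lr * fan_out / fan_in

# Example: doubling width on the output side doubles the SGD lr for that layer.
lr = spectral_sgd_lr(0.1, fan_in=256, fan_out=512)  # 0.2
std = spectral_init_std(64, 64)                      # 1 / 16 = 0.0625
```

With these multipliers applied per layer, a base learning rate tuned at small width should transfer to wider models without re-tuning, which is the point of muP-style parametrizations.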
Alternatives and similar repositories for ezmup
Users that are interested in ezmup are comparing it to the libraries listed below
- These papers provide unique, insightful concepts that will broaden your perspective on neural networks and deep learning ☆48 · Updated 2 years ago
- WIP ☆94 · Updated last year
- ☆53 · Updated last year
- Supporting PyTorch FSDP for optimizers ☆84 · Updated 9 months ago
- Minimal (400 LOC) implementation of maximal (multi-node, FSDP) GPT training ☆132 · Updated last year
- ☆87 · Updated last year
- ☆34 · Updated last year
- ☆30 · Updated 9 months ago
- Tiny re-implementation of MDM in the style of LLaDA and the nano-gpt speedrun ☆56 · Updated 6 months ago
- Focused on fast experimentation and simplicity ☆75 · Updated 8 months ago
- Mixture of A Million Experts ☆47 · Updated last year
- Yet another random morning idea to be quickly tried and architecture shared if it works; to allow the transformer to pause for any amount… ☆53 · Updated last year
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆157 · Updated 2 months ago
- Sparse Autoencoders for Stable Diffusion XL models. ☆70 · Updated last month
- ☆23 · Updated last year
- ☆65 · Updated 9 months ago
- ☆53 · Updated last year
- LoRA for arbitrary JAX models and functions ☆142 · Updated last year
- Research implementation of Native Sparse Attention (2502.11089) ☆61 · Updated 6 months ago
- Replicating and dissecting the git-re-basin project in one-click-replication Colabs ☆36 · Updated 2 years ago
- PyTorch implementation of the PEER block from the paper Mixture of A Million Experts, by Xu Owen He at DeepMind ☆128 · Updated last year
- Explorations into the recently proposed Taylor Series Linear Attention ☆100 · Updated last year
- ☆27 · Updated last year
- ☆57 · Updated 11 months ago
- Efficient optimizers ☆261 · Updated last month
- ☆106 · Updated 2 years ago
- A library for unit scaling in PyTorch ☆130 · Updated 2 months ago
- ☆36 · Updated last week
- ☆53 · Updated 9 months ago
- Automatically take good care of your preemptible TPUs ☆36 · Updated 2 years ago