facebookresearch / optimizers
For optimization algorithm research and development.
☆558 · Updated 3 weeks ago
Alternatives and similar repositories for optimizers
Users interested in optimizers are comparing it to the libraries listed below:
- Annotated version of the Mamba paper · ☆495 · Updated last year
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… · ☆406 · Updated this week
- ☆314 · Updated last year
- Efficient optimizers · ☆281 · Updated last month
- Implementation of Diffusion Transformer (DiT) in JAX · ☆306 · Updated last year
- TensorDict is a PyTorch-dedicated tensor container. · ☆1,003 · Updated last week
- ☆289 · Updated last year
- Scalable and performant data loading · ☆364 · Updated this week
- Universal notation for tensor operations in Python. · ☆464 · Updated 9 months ago
- CIFAR-10 speedruns: 94% in 2.6 seconds and 96% in 27 seconds · ☆349 · Updated 2 months ago
- Legible, scalable, reproducible foundation models with named tensors and JAX · ☆693 · Updated last week
- The AdEMAMix Optimizer: Better, Faster, Older. · ☆186 · Updated last year
- 🧱 Modula software package · ☆322 · Updated 5 months ago
- A JAX-based library for building transformers; includes implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT, and more. · ☆298 · Updated last year
- Named tensors with first-class dimensions for PyTorch · ☆332 · Updated 2 years ago
- A subset of PyTorch's neural network modules, written in Python using OpenAI's Triton. · ☆596 · Updated 5 months ago
- An implementation of the PSGD Kron second-order optimizer for PyTorch · ☆98 · Updated 6 months ago
- ☆490 · Updated last year
- Best practices and guides on how to write distributed PyTorch training code · ☆575 · Updated 3 months ago
- ☆246 · Updated last year
- Library for reading and processing ML training data. · ☆677 · Updated this week
- ☆215 · Updated last year
- Implementation of Flash Attention in JAX · ☆225 · Updated last year
- Puzzles for exploring transformers · ☆384 · Updated 2 years ago
- Official implementation of "ADOPT: Modified Adam Can Converge with Any β2 with the Optimal Rate" · ☆434 · Updated last year
- jax-triton contains integrations between JAX and OpenAI Triton · ☆437 · Updated last month
- Modular, scalable library to train ML models · ☆203 · Updated last week
- Accelerated first-order parallel associative scan · ☆196 · Updated 3 weeks ago
- Dion optimizer algorithm · ☆424 · Updated 2 weeks ago
- Implementation of https://srush.github.io/annotated-s4 · ☆512 · Updated 7 months ago