mlcommons / algorithmic-efficiency
MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvements in both training algorithms and models.
☆400 · Updated this week
Alternatives and similar repositories for algorithmic-efficiency
Users interested in algorithmic-efficiency are comparing it to the libraries listed below.
- For optimization algorithm research and development. ☆543 · Updated last week
- ☆234 · Updated 8 months ago
- 🧱 Modula software package ☆291 · Updated 2 months ago
- jax-triton contains integrations between JAX and OpenAI Triton ☆428 · Updated last week
- Named tensors with first-class dimensions for PyTorch ☆331 · Updated 2 years ago
- ☆283 · Updated last year
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆671 · Updated this week
- ☆456 · Updated last year
- Puzzles for exploring transformers ☆373 · Updated 2 years ago
- Universal Notation for Tensor Operations in Python. ☆438 · Updated 6 months ago
- CIFAR-10 speedruns: 94% in 2.6 seconds and 96% in 27 seconds ☆320 · Updated 3 months ago
- PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioner, low-rank approximation precondition… ☆188 · Updated last week
- TensorDict is a PyTorch-dedicated tensor container. ☆975 · Updated this week
- JAX Synergistic Memory Inspector ☆179 · Updated last year
- CLU lets you write beautiful training loops in JAX. ☆356 · Updated 4 months ago
- Compositional Linear Algebra ☆489 · Updated 2 months ago
- Implementation of Flash Attention in Jax ☆219 · Updated last year
- Efficient optimizers ☆275 · Updated last week
- Annotated version of the Mamba paper ☆489 · Updated last year
- ☆335 · Updated last month
- A library for unit scaling in PyTorch ☆132 · Updated 3 months ago
- Library for reading and processing ML training data. ☆570 · Updated last week
- Pax is a Jax-based machine learning framework for training large-scale models. Pax allows for advanced and fully configurable experimenta… ☆539 · Updated last month
- Run PyTorch in JAX. 🤝 ☆305 · Updated 2 weeks ago
- A Jax-based library for building transformers; includes implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT, and more. ☆295 · Updated last year
- ☆785 · Updated last month
- Automatic gradient descent ☆215 · Updated 2 years ago
- JMP is a Mixed Precision library for JAX. ☆208 · Updated 8 months ago
- A simple library for scaling up JAX programs ☆144 · Updated 11 months ago
- Named Tensors for Legible Deep Learning in JAX ☆211 · Updated last week