iShohei220 / adopt
Official Implementation of "ADOPT: Modified Adam Can Converge with Any β2 with the Optimal Rate"
☆425 · Updated 9 months ago
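The paper's titular modification to Adam is to normalize the gradient by the *previous* step's second-moment estimate before the momentum update, decoupling the normalizer from the current gradient. Below is a minimal NumPy sketch of that ADOPT-style update on a toy quadratic; the function names and hyperparameter values are illustrative assumptions, not the repository's actual API (the repo itself provides a PyTorch optimizer).

```python
import numpy as np

def adopt_step(x, m, v, grad_fn, lr=0.01, b1=0.9, b2=0.9999, eps=1e-6):
    """One ADOPT-style step (illustrative sketch, not the repo's code):
    normalize the gradient by the PREVIOUS second-moment estimate v,
    update momentum, take the step, and only then update v."""
    g = grad_fn(x)
    m = b1 * m + (1 - b1) * g / np.maximum(np.sqrt(v), eps)  # uses old v
    x = x - lr * m
    v = b2 * v + (1 - b2) * g**2  # v updated after it was used
    return x, m, v

# Toy quadratic f(x) = x^2 with gradient 2x
grad_fn = lambda x: 2.0 * x
x = np.array(1.0)
v = grad_fn(x) ** 2           # initialize v from the first gradient
m = np.zeros_like(x)
for _ in range(500):
    x, m, v = adopt_step(x, m, v, grad_fn)
print(float(abs(x)))          # approaches the minimum at 0
```

The ordering is the whole point: Adam divides by a v that already contains the current gradient, which is what breaks convergence for some β2; using the lagged v removes that correlation.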
Alternatives and similar repositories for adopt
Users interested in adopt are comparing it to the libraries listed below.
- The AdEMAMix Optimizer: Better, Faster, Older. ☆186 · Updated last year
- For optimization algorithm research and development. ☆539 · Updated 2 weeks ago
- Library for Jacobian descent with PyTorch. It enables the optimization of neural networks with multiple losses (e.g. multi-task learning)… ☆271 · Updated this week
- Getting crystal-like representations with harmonic loss ☆194 · Updated 6 months ago
- Efficient optimizers ☆265 · Updated last week
- CIFAR-10 speedruns: 94% in 2.6 seconds and 96% in 27 seconds ☆303 · Updated 2 months ago
- Annotated version of the Mamba paper ☆489 · Updated last year
- ☆308 · Updated last year
- TensorDict is a pytorch dedicated tensor container. ☆970 · Updated this week
- TensorHue is a Python library that allows you to visualize tensors right in your console, making understanding and debugging tensor conte… ☆120 · Updated 7 months ago
- ☆150 · Updated last year
- When it comes to optimizers, it's always better to be safe than sorry ☆373 · Updated 2 weeks ago
- D-Adaptation for SGD, Adam and AdaGrad ☆525 · Updated 8 months ago
- Implementation of Diffusion Transformer (DiT) in JAX ☆292 · Updated last year
- Scalable and Performant Data Loading ☆304 · Updated this week
- Simple, minimal implementation of the Mamba SSM in one pytorch file. Using logcumsumexp (Heisen sequence). ☆122 · Updated 11 months ago
- Implementation of the proposed minGRU in Pytorch ☆306 · Updated 6 months ago
- Just some miscellaneous utility functions / decorators / modules related to Pytorch and Accelerate to help speed up implementation of new… ☆123 · Updated last year
- Schedule-Free Optimization in PyTorch ☆2,217 · Updated 4 months ago
- ☆216 · Updated 10 months ago
- Kolmogorov-Arnold Networks (KAN) using Chebyshev polynomials instead of B-splines. ☆392 · Updated last year
- optimizer & lr scheduler & loss function collections in PyTorch ☆361 · Updated last week
- An implementation of PSGD Kron second-order optimizer for PyTorch ☆95 · Updated 2 months ago
- Quick implementation of nGPT, learning entirely on the hypersphere, from NvidiaAI ☆291 · Updated 4 months ago
- Universal Notation for Tensor Operations in Python. ☆433 · Updated 6 months ago
- Official repository for the paper "Grokfast: Accelerated Grokking by Amplifying Slow Gradients" ☆562 · Updated last year
- Simple and readable code for training and sampling from diffusion models ☆624 · Updated 3 months ago
- Implementation of the Adan (ADAptive Nesterov momentum algorithm) Optimizer in Pytorch ☆252 · Updated 3 years ago
- Transform datasets at scale. Optimize datasets for fast AI model training. ☆543 · Updated last week
- Code for Adam-mini: Use Fewer Learning Rates To Gain More https://arxiv.org/abs/2406.16793 ☆437 · Updated 4 months ago