kach / gradient-descent-the-ultimate-optimizer
Code for our NeurIPS 2022 paper
☆369 · Updated 2 years ago
Alternatives and similar repositories for gradient-descent-the-ultimate-optimizer
Users interested in gradient-descent-the-ultimate-optimizer are comparing it to the libraries listed below.
- Named tensors with first-class dimensions for PyTorch ☆331 · Updated 2 years ago
- Implicit MLE: Backpropagating Through Discrete Exponential Family Distributions ☆259 · Updated 2 years ago
- Cockpit: A Practical Debugging Tool for Training Deep Neural Networks ☆484 · Updated 3 years ago
- ☆787 · Updated last week
- A library to inspect and extract intermediate layers of PyTorch models. ☆474 · Updated 3 years ago
- This library would form a permanent home for reusable components for deep probabilistic programming. The library would form and harness a… ☆309 · Updated 4 months ago
- Laplace approximations for Deep Learning. ☆527 · Updated 6 months ago
- Implementation of https://srush.github.io/annotated-s4 ☆504 · Updated 4 months ago
- Code release for "Git Re-Basin: Merging Models modulo Permutation Symmetries" ☆497 · Updated 2 years ago
- Implementation of the Adan (ADAptive Nesterov momentum algorithm) optimizer in PyTorch ☆252 · Updated 3 years ago
- TorchOpt is an efficient library for differentiable optimization built upon PyTorch. ☆617 · Updated 2 weeks ago
- Constrained optimization toolkit for PyTorch ☆701 · Updated 3 months ago
- D-Adaptation for SGD, Adam and AdaGrad ☆525 · Updated 9 months ago
- Tensors, for human consumption ☆1,324 · Updated 3 weeks ago
- BackPACK - a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient. ☆599 · Updated 10 months ago
- Helps you write algorithms in PyTorch that adapt to the available (CUDA) memory