formll / dog
DoG is SGD's Best Friend: A Parameter-Free Dynamic Step Size Schedule
☆63 · Updated last year
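DoG (Distance over Gradients) sets its step size from quantities the run itself produces: the largest distance travelled from the initial iterate divided by the root of the summed squared gradient norms, so no learning rate needs to be tuned. The repo ships a PyTorch optimizer; the snippet below is only a minimal sketch of that rule under the formula from the paper, with a hypothetical class name, an illustrative `r_eps` seed, and none of the repo's actual API.

```python
# Illustrative sketch of a DoG-style parameter-free step size (not the repo's API):
#   eta_t = max_{i<=t} ||x_i - x_0|| / sqrt(sum_{i<=t} ||g_i||^2)
import torch


class DoGSketch:
    def __init__(self, params, r_eps=1e-6):
        self.params = list(params)
        self.x0 = [p.detach().clone() for p in self.params]  # initial iterate x_0
        self.rbar = r_eps        # running max distance from x_0, seeded with a small r_eps
        self.grad_sq_sum = 0.0   # running sum of squared gradient norms

    @torch.no_grad()
    def step(self):
        # Accumulate the squared norm of the current full-parameter gradient.
        self.grad_sq_sum += sum(
            p.grad.pow(2).sum().item() for p in self.params if p.grad is not None
        )
        # Track the maximum distance travelled from the initial point.
        dist = sum(
            (p - p0).pow(2).sum().item() for p, p0 in zip(self.params, self.x0)
        ) ** 0.5
        self.rbar = max(self.rbar, dist)
        # Parameter-free step size: distance over root of summed gradient norms.
        eta = self.rbar / (self.grad_sq_sum ** 0.5 + 1e-12)
        for p in self.params:
            if p.grad is not None:
                p.add_(p.grad, alpha=-eta)
```

Used like a plain optimizer (`loss.backward()` then `sketch.step()`); the official implementation in this repo is the one to reach for in practice.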
Alternatives and similar repositories for dog
Users interested in dog are comparing it to the libraries listed below.
- Parameter-Free Optimizers for Pytorch ☆130 · Updated last year
- Replicating and dissecting the git-re-basin project in one-click-replication Colabs ☆36 · Updated 2 years ago
- ☆53 · Updated 8 months ago
- [ICML 2024] SINGD: KFAC-like Structured Inverse-Free Natural Gradient Descent (http://arxiv.org/abs/2312.05705) ☆22 · Updated 7 months ago
- Riemannian Optimization Using JAX ☆49 · Updated last year
- Transformers with doubly stochastic attention ☆46 · Updated 2 years ago
- Unofficial but Efficient Implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆83 · Updated last year
- Lightning-like training API for JAX with Flax ☆41 · Updated 6 months ago
- unofficial re-implementation of "Grokking: Generalization Beyond Overfitting on Small Algorithmic Datasets" ☆78 · Updated 2 years ago
- Pytorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioner, low-rank approximation precondition… ☆177 · Updated 2 weeks ago
- ☆190 · Updated 6 months ago
- ☆26 · Updated 2 years ago
- 🧱 Modula software package ☆200 · Updated 2 months ago
- ☆21 · Updated 3 weeks ago
- LoRA for arbitrary JAX models and functions ☆138 · Updated last year
- Euclidean Wasserstein-2 optimal transportation ☆47 · Updated last year
- nanoGPT-like codebase for LLM training ☆98 · Updated last month
- Fine-grained, dynamic control of neural network topology in JAX. ☆21 · Updated last year
- PyTorch linear operators for curvature matrices (Hessian, Fisher/GGN, KFAC, ...) ☆40 · Updated 2 months ago
- This repository includes code to reproduce the tables in "Loss Landscapes are All You Need: Neural Network Generalization Can Be Explaine… ☆37 · Updated 2 years ago
- Optimization algorithm which fits a ResNet to CIFAR-10 5x faster than SGD / Adam (with terrible generalization) ☆14 · Updated last year
- ☆36 · Updated last year
- ☆16 · Updated 9 months ago
- Running Jax in PyTorch Lightning ☆102 · Updated 6 months ago
- ☆32 · Updated last year
- ☆68 · Updated 6 months ago
- ☆55 · Updated 2 months ago
- Implementation of GateLoop Transformer in Pytorch and Jax ☆89 · Updated last year
- This repository contains PyTorch implementations of various random feature maps for dot product kernels. ☆21 · Updated 11 months ago
- Code accompanying our paper "Feature Learning in Infinite-Width Neural Networks" (https://arxiv.org/abs/2011.14522) ☆62 · Updated 4 years ago