awslabs / adatune
Gradient-based Hyperparameter Tuning library in PyTorch
☆290 · Updated 4 years ago
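adatune implements gradient-based tuning of optimizer hyperparameters such as the learning rate. The snippet below is a minimal, self-contained sketch of that general idea (hypergradient-style learning-rate adaptation) in plain PyTorch; it is not adatune's actual API, and all names in it are illustrative.

```python
# Minimal sketch: adapt the learning rate with its own gradient signal
# (hypergradient descent style). Illustrative only, not adatune's API.
import torch

torch.manual_seed(0)

# Toy linear-regression data.
X = torch.randn(256, 10)
true_w = torch.randn(10, 1)
y = X @ true_w + 0.01 * torch.randn(256, 1)

w = torch.zeros(10, 1, requires_grad=True)
lr = 0.01                        # learning rate, adapted online
beta = 1e-4                      # "hyper" learning rate for updating lr
prev_grad = torch.zeros_like(w)

for step in range(200):
    loss = torch.mean((X @ w - y) ** 2)
    grad, = torch.autograd.grad(loss, w)

    # The gradient of the loss w.r.t. lr is -<grad_t, grad_{t-1}>,
    # so gradient descent on lr adds beta * <grad_t, grad_{t-1}>.
    lr = lr + beta * torch.sum(grad * prev_grad).item()

    with torch.no_grad():
        w -= lr * grad           # ordinary SGD step with the adapted lr
    prev_grad = grad.detach()

print(f"adapted lr: {lr:.4f}, final loss: {loss.item():.6f}")
```

The point of the sketch is only to show the mechanism the library's description refers to: the learning rate is treated as a differentiable quantity and nudged by a second, outer gradient step instead of being fixed by hand.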
Alternatives and similar repositories for adatune
Users interested in adatune are comparing it to the libraries listed below:
- End-to-end training of sparse deep neural networks with little-to-no performance loss. ☆323 · Updated 2 years ago
- Implementation for the Lookahead Optimizer. ☆241 · Updated 3 years ago
- Unit Testing for pytorch, based on mltest ☆310 · Updated 4 years ago
- This repository contains the results for the paper: "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers" ☆180 · Updated 3 years ago
- Official code for the Stochastic Polyak step-size optimizer ☆139 · Updated last year
- Python implementation of GLN in different frameworks ☆97 · Updated 4 years ago
- PyTorch functions and utilities to make your life easier ☆195 · Updated 4 years ago
- Shape and dimension inference (Keras-like) for PyTorch layers and neural networks ☆571 · Updated 3 years ago
- A PyTorch implementation of Neighbourhood Components Analysis. ☆399 · Updated 4 years ago
- ☆470 · Updated 2 months ago
- Cockpit: A Practical Debugging Tool for Training Deep Neural Networks ☆480 · Updated 3 years ago
- PyTorch dataset extended with map, cache etc. (tensorflow.data like) ☆330 · Updated 3 years ago
- 🧀 Pytorch code for the Fromage optimiser. ☆124 · Updated last year
- ☆144 · Updated 2 years ago
- Fast Block Sparse Matrices for Pytorch ☆548 · Updated 4 years ago
- A repository in preparation for open-sourcing lottery ticket hypothesis code. ☆632 · Updated 2 years ago
- Keras implementation of Legendre Memory Units ☆215 · Updated this week
- Implements stochastic line search ☆118 · Updated 2 years ago
- ☆376 · Updated last year
- Sparse learning library and sparse momentum resources. ☆381 · Updated 3 years ago
- RLStructures is a library to facilitate the implementation of new reinforcement learning algorithms. It includes a library, a tutorial, a… ☆260 · Updated 2 years ago
- Asynchronous Distributed Hyperparameter Optimization. ☆294 · Updated 5 months ago
- Implementation of Estimating Training Data Influence by Tracing Gradient Descent (NeurIPS 2020) ☆233 · Updated 3 years ago
- To Run, Manage and Visualize Large Scale Experiments ☆171 · Updated last year
- ☆78 · Updated 5 years ago
- BackPACK - a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient. ☆588 · Updated 6 months ago
- Code for Neural Architecture Search without Training (ICML 2021) ☆470 · Updated 3 years ago
- Code for Parameter Prediction for Unseen Deep Architectures (NeurIPS 2021) ☆491 · Updated 2 years ago
- Deal with bad samples in your dataset dynamically, use Transforms as Filters, and more! ☆377 · Updated 2 years ago
- ☆153 · Updated 5 years ago