facebookresearch / open_lth
A repository in preparation for open-sourcing lottery ticket hypothesis code.
☆635 · Updated 3 years ago
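The core procedure behind lottery ticket hypothesis code like open_lth is iterative magnitude pruning with weight rewinding: train, globally prune the smallest-magnitude surviving weights, reset the survivors to their initial values, and repeat. The sketch below is a minimal NumPy illustration of that loop on toy data; the "training" step is a stand-in for SGD, and none of the names correspond to the repository's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_by_magnitude(weights, mask, fraction=0.2):
    """Remove the smallest-magnitude `fraction` of still-unpruned weights."""
    surviving = np.abs(weights[mask])
    k = int(surviving.size * fraction)
    if k == 0:
        return mask.copy()
    threshold = np.partition(surviving, k - 1)[k - 1]  # k-th smallest magnitude
    return mask & (np.abs(weights) > threshold)

# Toy setup: a flat weight vector and an all-ones pruning mask.
init = rng.normal(size=1000)
mask = np.ones_like(init, dtype=bool)

for _ in range(3):
    trained = init + 0.1 * rng.normal(size=init.size)  # stand-in for training
    mask = prune_by_magnitude(trained, mask, fraction=0.2)
    # Lottery-ticket rewind: surviving weights go back to their init values.
    ticket = np.where(mask, init, 0.0)

print(f"density after 3 rounds: {mask.mean():.2f}")  # → 0.51 (0.8 ** 3)
```

Pruning 20% per round compounds, so three rounds leave about 51% of the weights; the resulting `ticket` (mask plus original initialization) is what the hypothesis claims trains to full accuracy in isolation.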
Alternatives and similar repositories for open_lth
Users interested in open_lth also compare it to the libraries listed below:
- End-to-end training of sparse deep neural networks with little-to-no performance loss. ☆335 · Updated 3 years ago
- Sparse learning library and sparse momentum resources. ☆385 · Updated 3 years ago
- Efficient PyTorch Hessian eigendecomposition tools! ☆384 · Updated last year
- ☆228 · Updated last year
- This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks"… ☆342 · Updated 2 years ago
- ☆145 · Updated 2 years ago
- ☆194 · Updated this week
- PyTorch library to facilitate development and standardized evaluation of neural network pruning methods. ☆432 · Updated 2 years ago
- PyHessian is a PyTorch library for second-order-based analysis and training of neural networks. ☆773 · Updated 7 months ago
- A reimplementation of "The Lottery Ticket Hypothesis" (Frankle and Carbin) on MNIST. ☆725 · Updated 5 years ago
- BackPACK: a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient. ☆604 · Updated 2 months ago
- ☆472 · Updated last week
- Fast block-sparse matrices for PyTorch. ☆550 · Updated 5 years ago
- Gradient-based hyperparameter tuning library in PyTorch. ☆291 · Updated 5 years ago
- ☆159 · Updated 3 years ago
- Code for Neural Architecture Search without Training (ICML 2021). ☆474 · Updated 4 years ago
- Cockpit: A Practical Debugging Tool for Training Deep Neural Networks. ☆487 · Updated 3 years ago
- ☆619 · Updated last month
- higher is a PyTorch library allowing users to obtain higher-order gradients over losses spanning training loops rather than individual tr… ☆1,629 · Updated 3 years ago
- ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning. ☆283 · Updated 2 years ago
- Mode Connectivity and Fast Geometric Ensembles in PyTorch. ☆283 · Updated 3 years ago
- This repository contains the results for the paper "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers". ☆184 · Updated 4 years ago
- A drop-in replacement for CIFAR-10. ☆247 · Updated 4 years ago
- ☆539 · Updated 4 years ago
- Naszilla is a Python library for neural architecture search (NAS). ☆316 · Updated 3 years ago
- [ICLR 2020] Drawing Early-Bird Tickets: Toward More Efficient Training of Deep Networks. ☆140 · Updated 5 years ago
- PyTorch layer-by-layer model profiler. ☆607 · Updated 4 years ago
- Project site for "Your Classifier is Secretly an Energy-Based Model and You Should Treat it Like One". ☆430 · Updated 3 years ago
- Implementation of Estimating Training Data Influence by Tracing Gradient Descent (NeurIPS 2020). ☆239 · Updated 3 years ago
- Implementation of the Lookahead optimizer. ☆244 · Updated 3 years ago