facebookresearch / open_lth
A repository in preparation for open-sourcing lottery ticket hypothesis code.
☆632 · Updated 3 years ago
Alternatives and similar repositories for open_lth
Users interested in open_lth are comparing it to the libraries listed below.
- End-to-end training of sparse deep neural networks with little-to-no performance loss. ☆331 · Updated 2 years ago
- Sparse learning library and sparse momentum resources. ☆384 · Updated 3 years ago
- This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks"… ☆336 · Updated 2 years ago
- Efficient PyTorch Hessian eigendecomposition tools! ☆382 · Updated last year
- PyTorch library to facilitate development and standardized evaluation of neural network pruning methods. ☆432 · Updated 2 years ago
- Code for Neural Architecture Search without Training (ICML 2021). ☆473 · Updated 4 years ago
- A reimplementation of "The Lottery Ticket Hypothesis" (Frankle and Carbin) on MNIST. ☆722 · Updated 5 years ago
- PyHessian is a PyTorch library for second-order-based analysis and training of neural networks. ☆759 · Updated 4 months ago
- BackPACK: a backpropagation package built on top of PyTorch that efficiently computes quantities other than the gradient. ☆601 · Updated last week
- Fast block-sparse matrices for PyTorch. ☆550 · Updated 4 years ago
- Gradient-based hyperparameter tuning library in PyTorch. ☆290 · Updated 5 years ago
- Cockpit: a practical debugging tool for training deep neural networks. ☆484 · Updated 3 years ago
- Naszilla is a Python library for neural architecture search (NAS). ☆314 · Updated 2 years ago
- NASBench: a neural architecture search dataset and benchmark. ☆708 · Updated 2 years ago
- [ICLR 2020] Drawing Early-Bird Tickets: Toward More Efficient Training of Deep Networks. ☆140 · Updated 5 years ago
- ADAHESSIAN: an adaptive second-order optimizer for machine learning. ☆281 · Updated 2 years ago
- Mode connectivity and fast geometric ensembles in PyTorch. ☆280 · Updated 3 years ago
- Reduce end-to-end training time from days to hours (or hours to minutes), and energy requirements/costs by an order of magnitude using co… ☆343 · Updated 2 years ago
- Project site for "Your Classifier is Secretly an Energy-Based Model and You Should Treat it Like One". ☆425 · Updated 3 years ago
- Implementation of "Estimating Training Data Influence by Tracing Gradient Descent" (NeurIPS 2020). ☆237 · Updated 3 years ago
- Unit testing for PyTorch, based on mltest. ☆312 · Updated 5 years ago
- This repository contains the results for the paper "Descending through a Crowded Valley: Benchmarking Deep Learning Optimizers". ☆184 · Updated 4 years ago
- Code for Parameter Prediction for Unseen Deep Architectures (NeurIPS 2021). ☆492 · Updated 2 years ago
- higher is a PyTorch library allowing users to obtain higher-order gradients over losses spanning training loops rather than individual tr… ☆1,627 · Updated 3 years ago
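Several entries above revolve around the lottery ticket procedure: train a dense network, prune the smallest-magnitude weights, and rewind the survivors to their original initialization. A minimal NumPy sketch of one pruning round (the function name and the weight values are illustrative assumptions, not taken from any of the listed repositories):

```python
import numpy as np

def magnitude_prune_mask(weights, sparsity):
    """Return a binary mask keeping the largest-magnitude weights.

    Keeps roughly (1 - sparsity) of the entries; the rest are zeroed.
    """
    flat = np.abs(weights).ravel()
    k = max(1, int(np.ceil((1.0 - sparsity) * flat.size)))
    threshold = np.sort(flat)[-k]  # k-th largest magnitude
    return (np.abs(weights) >= threshold).astype(weights.dtype)

# One lottery-ticket round: train, prune 50%, rewind survivors to init.
init_weights = np.array([[0.5, -0.1], [0.05, -0.8]])   # weights at init
trained = np.array([[1.2, -0.3], [0.02, -2.0]])        # pretend post-training values
mask = magnitude_prune_mask(trained, sparsity=0.5)     # [[1, 0], [0, 1]]
rewound = init_weights * mask                          # winning-ticket candidate
```

In the full procedure this train/prune/rewind loop is iterated, pruning a fixed fraction (e.g. 20%) of the remaining weights each round.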
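The second-order tools listed above (PyHessian, ADAHESSIAN, the Hessian eigendecomposition utilities) all build on Hessian-vector products, which cost about as much as a gradient and never form the full Hessian. Those libraries compute them with autograd (the Pearlmutter trick); the finite-difference sketch below is a simplified stand-in, and `grad_fn` plus the quadratic test function are assumptions for illustration:

```python
import numpy as np

def hvp(grad_fn, x, v, eps=1e-5):
    """Approximate the Hessian-vector product H(x) @ v via a central
    difference of the gradient along direction v."""
    return (grad_fn(x + eps * v) - grad_fn(x - eps * v)) / (2 * eps)

# Quadratic f(x) = 0.5 x^T A x has gradient A @ x and Hessian A,
# so the HVP should recover a column combination of A exactly.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
grad = lambda x: A @ x
v = np.array([1.0, 0.0])
result = hvp(grad, np.zeros(2), v)  # ≈ A @ v
```

Combined with power iteration or Lanczos, repeated HVPs are enough to estimate top Hessian eigenvalues, which is how the eigendecomposition tools above stay tractable for large networks.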