facebookresearch/open_lth
A repository in preparation for open-sourcing lottery ticket hypothesis code.
☆627 · Updated 2 years ago
Related projects
Alternatives and complementary repositories for open_lth
- End-to-end training of sparse deep neural networks with little-to-no performance loss. ☆317 · Updated last year
- This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks"… ☆325 · Updated last year
- PyTorch library to facilitate development and standardized evaluation of neural network pruning methods. ☆424 · Updated last year
- ☆219 · Updated 3 months ago
- ☆143 · Updated last year
- Fast Block Sparse Matrices for PyTorch. ☆545 · Updated 3 years ago
- Sparse learning library and sparse momentum resources. ☆379 · Updated 2 years ago
- ☆186 · Updated 3 years ago
- PyHessian is a PyTorch library for second-order-based analysis and training of neural networks. ☆687 · Updated 7 months ago
- A reimplementation of "The Lottery Ticket Hypothesis" (Frankle and Carbin) on MNIST. ☆709 · Updated 4 years ago
- PyTorch layer-by-layer model profiler. ☆608 · Updated 3 years ago
- Efficient PyTorch Hessian eigendecomposition tools! ☆364 · Updated 8 months ago
- ☆153 · Updated 2 years ago
- BackPACK: a backpropagation package built on top of PyTorch that efficiently computes quantities other than the gradient. ☆561 · Updated this week
- ☆466 · Updated 3 months ago
- A research library for PyTorch-based neural network pruning, compression, and more. ☆160 · Updated last year
- Mode Connectivity and Fast Geometric Ensembles in PyTorch. ☆265 · Updated 2 years ago
- Code for "Neural Architecture Search without Training" (ICML 2021). ☆460 · Updated 3 years ago
- A drop-in replacement for CIFAR-10. ☆236 · Updated 3 years ago
- [ICLR 2020] Drawing Early-Bird Tickets: Toward More Efficient Training of Deep Networks. ☆137 · Updated 4 years ago
- Naszilla is a Python library for neural architecture search (NAS). ☆304 · Updated last year
- Cockpit: A Practical Debugging Tool for Training Deep Neural Networks. ☆473 · Updated 2 years ago
- higher is a PyTorch library allowing users to obtain higher-order gradients over losses spanning training loops rather than individual tr… ☆1,593 · Updated 2 years ago
- NASBench: A Neural Architecture Search Dataset and Benchmark. ☆685 · Updated last year
- Understanding Training Dynamics of Deep ReLU Networks. ☆279 · Updated 3 weeks ago
- ☆532 · Updated 2 years ago
- Totally Versatile Miscellanea for PyTorch. ☆468 · Updated 2 years ago
- ADAHESSIAN: An Adaptive Second-Order Optimizer for Machine Learning. ☆266 · Updated last year
- A LARS implementation in PyTorch. ☆335 · Updated 4 years ago
- Gradient-based hyperparameter tuning library in PyTorch. ☆289 · Updated 4 years ago
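Several of the repositories above (open_lth itself and the lottery ticket reimplementations) revolve around magnitude pruning: keeping only the largest-magnitude weights and masking out the rest. As a rough illustration only, here is a minimal NumPy sketch of the masking step; the function name is hypothetical and this is not code from any listed repository, which typically apply the mask per layer inside a full train–prune–rewind loop.

```python
import numpy as np

def magnitude_prune_mask(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a binary mask that keeps the largest-magnitude weights.

    sparsity is the fraction of weights to remove (0.0 keeps everything).
    Illustrative sketch of one-shot magnitude pruning, not a library API.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to prune
    if k == 0:
        return np.ones_like(weights)
    # k-th smallest magnitude serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    return (np.abs(weights) > threshold).astype(weights.dtype)

# Iterative variants prune a fixed fraction per round and rewind the
# surviving weights to their initial values between rounds.
w0 = np.array([[0.5, -0.1], [0.05, -0.9]])
mask = magnitude_prune_mask(w0, sparsity=0.5)  # prunes the two smallest
```

In the lottery ticket procedure, the resulting mask is applied element-wise (`w0 * mask`) and the unpruned weights are reset to their original initialization before retraining.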