TimDettmers / sparse_learning
Sparse learning library and sparse momentum resources.
☆379 · Updated 2 years ago
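The core technique here is sparse momentum: the network is kept sparse throughout training, and at regular intervals each layer prunes a fraction of its smallest-magnitude weights and regrows the same number of connections where the momentum magnitude is largest. The snippet below is a minimal illustrative sketch of that prune-and-regrow step for a single layer in plain PyTorch; it omits the cross-layer redistribution used in the paper, and the function name and signature are hypothetical rather than the library's actual API.

```python
import torch

def sparse_momentum_step(weight, mask, momentum, prune_rate=0.2):
    # Hypothetical helper: one prune-and-regrow update for a single layer.
    # `weight` and `momentum` are dense tensors of the same shape; `mask`
    # is a 0/1 tensor marking the currently active weights.
    flat_w = weight.detach().abs().flatten()
    flat_m = momentum.abs().flatten()
    new_mask = mask.flatten().clone()

    n_active = int(new_mask.sum().item())
    n_prune = int(prune_rate * n_active)
    if n_prune == 0:
        return mask

    # Prune: deactivate the active weights with the smallest magnitude.
    active = torch.where(new_mask.bool(), flat_w, torch.full_like(flat_w, float("inf")))
    drop = torch.topk(active, n_prune, largest=False).indices
    new_mask[drop] = 0.0

    # Regrow: reactivate the same number of connections at inactive
    # positions whose momentum magnitude is largest.
    inactive = torch.where(new_mask.bool(), torch.full_like(flat_m, float("-inf")), flat_m)
    grow = torch.topk(inactive, n_prune, largest=True).indices
    new_mask[grow] = 1.0

    return new_mask.view_as(mask)
```

In a training loop, the returned mask would be re-applied to the weights after each optimizer step (e.g. `weight.data.mul_(mask)`); with `torch.optim.SGD(momentum=...)` the momentum buffer can be read from `optimizer.state[param]["momentum_buffer"]`.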
Related projects
Alternatives and complementary repositories for sparse_learning
- PyTorch library to facilitate development and standardized evaluation of neural network pruning methods. ☆424 · Updated last year
- End-to-end training of sparse deep neural networks with little-to-no performance loss. ☆317 · Updated last year
- PyTorch layer-by-layer model profiler ☆608 · Updated 3 years ago
- A repository in preparation for open-sourcing lottery ticket hypothesis code. ☆627 · Updated 2 years ago
- Fast Block Sparse Matrices for Pytorch ☆545 · Updated 3 years ago
- Implements quantized distillation. Code for our paper "Model compression via distillation and quantization" ☆329 · Updated 3 months ago
- A reimplementation of "The Lottery Ticket Hypothesis" (Frankle and Carbin) on MNIST (see the pruning sketch after this list). ☆709 · Updated 4 years ago
- Single-Path NAS: Designing Hardware-Efficient ConvNets in less than 4 Hours ☆396 · Updated 3 years ago
- Papers for deep neural network compression and acceleration ☆396 · Updated 3 years ago
- Library for faster pinned CPU <-> GPU transfer in Pytorch ☆683 · Updated 4 years ago
- A Re-implementation of Fixed-update Initialization ☆151 · Updated 5 years ago
- Implementations of ideas from recent papers ☆391 · Updated 3 years ago
- ConvNet training using pytorch ☆347 · Updated 3 years ago
- Code release for paper "Random Search and Reproducibility for NAS" ☆167 · Updated 5 years ago
- Naszilla is a Python library for neural architecture search (NAS) ☆304 · Updated last year
- Discovering Neural Wirings (https://arxiv.org/abs/1906.00586) ☆138 · Updated 4 years ago
- [ICLR 2020] Drawing Early-Bird Tickets: Toward More Efficient Training of Deep Networks ☆137 · Updated 4 years ago
- PyTorch Implementation of Weights Pruning ☆184 · Updated 6 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks"
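Several entries above build on the lottery ticket hypothesis, i.e. iterative magnitude pruning with weight rewinding: train, remove the smallest-magnitude weights, reset the survivors to their values at initialization, and retrain. Below is a minimal sketch of one such round, assuming `init_state` is a snapshot of the parameters taken before training; all names are hypothetical and the code is not taken from any of the repositories listed.

```python
import torch

def lottery_ticket_round(model, init_state, masks=None, prune_fraction=0.2):
    # Illustrative sketch of one iterative-magnitude-pruning round:
    # prune the smallest-magnitude surviving weights of each weight tensor,
    # then rewind the survivors to their original initialization.
    if masks is None:
        masks = {name: torch.ones_like(p)
                 for name, p in model.named_parameters() if p.dim() > 1}

    for name, param in model.named_parameters():
        if name not in masks:
            continue  # skip biases, norm parameters, etc.
        mask = masks[name]
        n_alive = int(mask.sum().item())
        n_prune = int(prune_fraction * n_alive)
        if n_prune > 0:
            scores = torch.where(mask.bool(), param.detach().abs(),
                                 torch.full_like(param, float("inf")))
            drop = torch.topk(scores.flatten(), n_prune, largest=False).indices
            mask.view(-1)[drop] = 0.0

        # Rewind: surviving weights go back to their initial values.
        with torch.no_grad():
            param.copy_(init_state[name] * mask)

    return masks
```

Here `init_state` would be captured before training, e.g. `{k: v.detach().clone() for k, v in model.named_parameters()}`, and during retraining the masks must be re-applied (to weights or gradients) so that pruned connections stay at zero.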