StijnVerdenius / SNIP-it
This repository is the official implementation of the paper "Pruning via Iterative Ranking of Sensitivity Statistics" and implements novel pruning/compression algorithms for deep neural networks. Among others, it implements structured pruning before training (with actual parameter shrinking) and unstructured pruning before and during training.
☆32 · Updated last year
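SNIP-it builds on SNIP's connection-sensitivity score, which ranks each weight by the magnitude of the gradient–weight product |g ⊙ w| and keeps only the most sensitive connections. A minimal sketch of that base scoring step, written here in NumPy with hypothetical helper names (the repository itself is PyTorch-based, and the paper's iterative re-ranking is not shown):

```python
import numpy as np

def snip_saliency(weights, grads):
    """SNIP-style connection sensitivity: |grad * weight|, normalized to sum to 1."""
    scores = np.abs(weights * grads)
    return scores / scores.sum()

def prune_mask(scores, keep_ratio):
    """Binary mask keeping the top `keep_ratio` fraction of most sensitive weights."""
    k = int(np.ceil(keep_ratio * scores.size))
    threshold = np.sort(scores.ravel())[::-1][k - 1]
    return (scores >= threshold).astype(np.float32)

# Example: keep the 50% most sensitive of four connections.
w = np.array([1.0, -2.0, 0.1, 3.0])
g = np.array([0.5, 0.1, 3.0, 0.0])
mask = prune_mask(snip_saliency(w, g), keep_ratio=0.5)
```

In the single-shot setting this mask is computed once from one mini-batch before training; the iterative variant in the paper re-computes and re-ranks these statistics over several pruning rounds.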
Alternatives and similar repositories for SNIP-it:
Users interested in SNIP-it are comparing it to the libraries listed below.
- Code for "Picking Winning Tickets Before Training by Preserving Gradient Flow" (https://openreview.net/pdf?id=SkgsACVKPH) ☆103 · Updated 5 years ago
- ☆14 · Updated 4 years ago
- Soft Threshold Weight Reparameterization for Learnable Sparsity ☆88 · Updated 2 years ago
- Prospect Pruning: Finding Trainable Weights at Initialization Using Meta-Gradients ☆31 · Updated 3 years ago
- Lookahead: A Far-sighted Alternative of Magnitude-based Pruning (ICLR 2020) ☆33 · Updated 4 years ago
- Code for "Sanity-Checking Pruning Methods: Random Tickets Can Win the Jackpot" ☆42 · Updated 4 years ago
- PyTorch implementation of the paper "SNIP: Single-shot Network Pruning based on Connection Sensitivity" by Lee et al. ☆107 · Updated 5 years ago
- Neuron Merging: Compensating for Pruned Neurons (NeurIPS 2020) ☆43 · Updated 4 years ago
- Code for "Online Learned Continual Compression with Adaptive Quantization Modules" ☆27 · Updated 4 years ago
- SNIP: Single-shot Network Pruning Based on Connection Sensitivity ☆113 · Updated 5 years ago
- Implementation of "Effective Sparsification of Neural Networks with Global Sparsity Constraint" ☆31 · Updated 3 years ago
- ☆189 · Updated 4 years ago
- Code for the paper "Orthogonal Convolutional Neural Networks" ☆116 · Updated 3 years ago
- This repository contains code to replicate the experiments from the NeurIPS 2019 paper "One ticket to win them all: generalizing lottery …