mil-ad / snip
PyTorch implementation of the paper "SNIP: Single-shot Network Pruning based on Connection Sensitivity" by Lee et al.
☆111 · Updated 6 years ago
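For reference, below is a minimal sketch of the connection-sensitivity criterion the paper describes: each weight is scored by |weight × gradient| on a single mini-batch at initialization, and only the top fraction of weights by this saliency is kept. This is not the repository's actual code; the `snip_masks` helper, the toy MLP, the `keep_ratio` parameter, and the random CIFAR-10-shaped batch are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def snip_masks(model, inputs, targets, keep_ratio=0.1):
    """Score every weight by |weight * gradient| on one batch and keep the top fraction."""
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)
    loss.backward()

    # Connection sensitivity per weight tensor (weights only, not biases).
    scores = {name: (p.grad * p).abs().detach()
              for name, p in model.named_parameters()
              if p.grad is not None and p.dim() > 1}
    flat = torch.cat([s.flatten() for s in scores.values()])
    k = max(1, int(keep_ratio * flat.numel()))
    threshold = torch.topk(flat, k).values[-1]  # k-th largest saliency

    return {name: (s >= threshold).float() for name, s in scores.items()}


if __name__ == "__main__":
    # Toy example on random CIFAR-10-shaped data; the MLP is illustrative only.
    net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256),
                        nn.ReLU(), nn.Linear(256, 10))
    x, y = torch.randn(64, 3, 32, 32), torch.randint(0, 10, (64,))
    masks = snip_masks(net, x, y, keep_ratio=0.1)
    with torch.no_grad():
        for name, p in net.named_parameters():
            if name in masks:
                p.mul_(masks[name])  # apply the binary mask once, before training
    kept = sum(int(m.sum()) for m in masks.values())
    total = sum(m.numel() for m in masks.values())
    print(f"kept {kept}/{total} weights")
```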
Alternatives and similar repositories for snip
Users interested in snip are comparing it to the repositories listed below.
- SNIP: SINGLE-SHOT NETWORK PRUNING BASED ON CONNECTION SENSITIVITY ☆115 · Updated 6 years ago
- ☆227 · Updated last year
- Soft Threshold Weight Reparameterization for Learnable Sparsity ☆90 · Updated 2 years ago
- Code for "Picking Winning Tickets Before Training by Preserving Gradient Flow" https://openreview.net/pdf?id=SkgsACVKPH ☆105 · Updated 5 years ago
- Code release for "Adversarial Robustness vs Model Compression, or Both?" ☆90 · Updated 4 years ago
- Pruning Neural Networks with Taylor criterion in PyTorch ☆319 · Updated 6 years ago
- Code for our ICML'2020 paper "Stabilizing Differentiable Architecture Search via Perturbation-based Regularization" ☆76 · Updated 4 years ago
- [ICLR2021 Outstanding Paper] Rethinking Architecture Selection in Differentiable NAS ☆104 · Updated 3 years ago
- Learning both Weights and Connections for Efficient Neural Networks https://arxiv.org/abs/1506.02626 ☆179 · Updated 3 years ago
- Code for "AttentiveNAS: Improving Neural Architecture Search via Attentive Sampling" ☆105 · Updated 4 years ago
- TPAMI 2021: NATS-Bench: Benchmarking NAS Algorithms for Architecture Topology and Size ☆186 · Updated 3 years ago
- Implementation of Continuous Sparsification, a method for pruning and ticket search in deep networks ☆33 · Updated 3 years ago
- A PyTorch implementation of MobileNet v2 on CIFAR-10 ☆64 · Updated 2 years ago
- [NeurIPS 2021] Sparse Training via Boosting Pruning Plasticity with Neuroregeneration ☆31 · Updated 2 years ago
- SNIP: SINGLE-SHOT NETWORK PRUNING ☆31 · Updated 8 months ago
- [ICLR 2020] Dynamic Sparse Training: Find Efficient Sparse Network From Scratch With Trainable Masked Layers ☆31 · Updated 5 years ago
- pytorch-tiny-imagenet ☆186 · Updated last month
- DeepHoyer: Learning Sparser Neural Network with Differentiable Scale-Invariant Sparsity Measures ☆32 · Updated 5 years ago
- [NeurIPS 2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, "Model Compression with Adversarial Robustness: … ☆49 · Updated 3 years ago
- [ICLR'21] Neural Pruning via Growing Regularization (PyTorch) ☆82 · Updated 4 years ago
- Codes for Layer-wise Optimal Brain Surgeon ☆78 · Updated 6 years ago
- PyTorch port of "Efficient Neural Architecture Search via Parameter Sharing" ☆53 · Updated 7 years ago
- [ICML 2021] "Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training" by Shiwei Liu, Lu Yin, De… ☆45 · Updated 2 years ago
- Comparison of methods for pruning at initialization prior to training (SynFlow/SNIP/GraSP) in PyTorch ☆17 · Updated last year
- This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks"… ☆336 · Updated 2 years ago
- ☆67 · Updated 5 years ago
- Model compression by constrained optimization, using the Learning-Compression (LC) algorithm ☆72 · Updated 3 years ago
- PyTorch library to facilitate development and standardized evaluation of neural network pruning methods ☆431 · Updated 2 years ago
- [CVPR 2020] When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks ☆124 · Updated 5 years ago
- Code accompanying the NeurIPS 2020 paper: WoodFisher (Singh & Alistarh, 2020) ☆53 · Updated 4 years ago