GATECH-EIC / Early-Bird-Tickets
[ICLR 2020] Drawing Early-Bird Tickets: Toward More Efficient Training of Deep Networks
☆138 · Updated 4 years ago
Alternatives and similar repositories for Early-Bird-Tickets
Users interested in Early-Bird-Tickets are comparing it to the libraries listed below.
- Code for "Picking Winning Tickets Before Training by Preserving Gradient Flow" (https://openreview.net/pdf?id=SkgsACVKPH) ☆104 · Updated 5 years ago
- ☆226 · Updated 10 months ago
- ☆70 · Updated 5 years ago
- Code for the paper "SWALP: Stochastic Weight Averaging for Low-Precision Training". ☆62 · Updated 6 years ago
- ☆191 · Updated 4 years ago
- [ICLR 2021] "Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective" by Wuyang Chen, Xinyu Gong, … ☆168 · Updated 3 years ago
- A research library for PyTorch-based neural network pruning, compression, and more. ☆162 · Updated 2 years ago
- SNIP: Single-Shot Network Pruning ☆30 · Updated 2 months ago
- ☆144 · Updated 2 years ago
- [ICLR 2020] NAS evaluation is frustratingly hard ☆149 · Updated last year
- Soft Threshold Weight Reparameterization for Learnable Sparsity ☆90 · Updated 2 years ago
- Code for "AttentiveNAS: Improving Neural Architecture Search via Attentive Sampling" ☆104 · Updated 3 years ago
- Code release for the paper "Random Search and Reproducibility for NAS" ☆167 · Updated 5 years ago
- Neural Architecture Transfer (arXiv'20), PyTorch implementation ☆156 · Updated 5 years ago
- [ICLR 2021 Spotlight] "CPT: Efficient Deep Neural Network Training via Cyclic Precision" by Yonggan Fu, Han Guo, Meng Li, Xin Yang, Yinin… ☆31 · Updated last year
- ☆67 · Updated 4 years ago
- SNIP: Single-Shot Network Pruning Based on Connection Sensitivity ☆114 · Updated 5 years ago
- Discovering Neural Wirings (https://arxiv.org/abs/1906.00586) ☆137 · Updated 5 years ago
- Identify a binary weight or binary weight and activation subnetwork within a randomly initialized network by only pruning and binarizing … ☆52 · Updated 3 years ago
- PyTorch implementation of the paper "SNIP: Single-shot Network Pruning based on Connection Sensitivity" by Lee et al. ☆108 · Updated 6 years ago
- End-to-end training of sparse deep neural networks with little-to-no performance loss. ☆322 · Updated 2 years ago
- ☆53 · Updated 6 years ago
- Pruning Neural Networks with Taylor criterion in PyTorch ☆318 · Updated 5 years ago
- Using ideas from product quantization for state-of-the-art neural network compression. ☆145 · Updated 3 years ago
- This repository contains code to replicate the experiments given in NeurIPS 2019 paper "One ticket to win them all: generalizing lottery …