A PyTorch implementation of scalable neural networks.
☆23 · Updated Jun 9, 2020
Alternatives and similar repositories for pytorch-scalable-neural-networks
Users that are interested in pytorch-scalable-neural-networks are comparing it to the libraries listed below
- ☆23 · Updated Oct 27, 2019
- ☆128 · Updated Nov 2, 2020
- ☆10 · Updated Jul 5, 2019
- ☆10 · Updated May 9, 2019
- [IROS 2021] ADD: A Fine-grained Dynamic Inference Architecture for Semantic Image Segmentation ☆10 · Updated May 3, 2022
- Official codebase for the paper "Joslim: Joint Widths and Weights Optimization for Slimmable Neural Networks" ☆12 · Updated Jun 30, 2021
- [ICASSP 2021] Official implementation of Multi-View Contrastive Learning for Online Knowledge Distillation (MCL-OKD) ☆27 · Updated Apr 7, 2021
- Hybrid Binary Networks: Optimizing for Accuracy, Efficiency and Memory (WACV 2018) ☆12 · Updated Jul 5, 2019
- ☆15 · Updated Jan 8, 2020
- [ICML 2024] DetKDS: Knowledge Distillation Search for Object Detectors ☆19 · Updated Jul 11, 2024
- Code for DATA: Differentiable ArchiTecture Approximation ☆11 · Updated Jul 22, 2021
- Implementation of ENAS for CNNs on CIFAR-10 ☆11 · Updated Oct 13, 2019
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Updated Jun 18, 2020
- Code for the paper "Few Shot Network Compression via Cross Distillation" (AAAI 2020) ☆31 · Updated Jan 31, 2020
- Successfully training approximations to full-rank matrices for efficiency in deep learning ☆17 · Updated Jan 5, 2021
- A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", … ☆181 · Updated Jan 29, 2022
- Code for LIT (ICML 2019) ☆20 · Updated Jun 11, 2019
- Official repository for Batch Level Distillation (BLD) ☆15 · Updated Jan 25, 2021
- Code for the ECCV 2022 paper "Contrastive Deep Supervision" ☆69 · Updated Sep 18, 2022
- Source code for the ICML 2019 paper "Shallow-Deep Networks: Understanding and Mitigating Network Overthinking" ☆37 · Updated Dec 22, 2023
- (Unofficial) Data-Distortion Guided Self-Distillation for Deep Neural Networks (AAAI 2019) ☆14 · Updated May 12, 2021
- [NeurIPS 2019] E2-Train: Training State-of-the-art CNNs with Over 80% Less Energy ☆21 · Updated Nov 18, 2019
- Distilling Knowledge via Intermediate Classifiers ☆16 · Updated Oct 3, 2021
- Unofficial implementation of Stand-Alone Self-Attention in Vision Models (obsolete) ☆44 · Updated Jul 1, 2019
- Code for the paper "Energy-Constrained Compression for Deep Neural Networks via Weighted Sparse Projection and Layer Input Masking" ☆18 · Updated May 7, 2019
- ☆19 · Updated May 28, 2020
- Implementation of the paper "Task-Oriented Feature Distillation" ☆43 · Updated Apr 25, 2022
- MSDNet ☆190 · Updated Jan 7, 2022
- Code for the paper "Training CNNs with Selective Allocation of Channels" (ICML 2019) ☆25 · Updated May 14, 2019
- [NeurIPS 2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, "Model Compression with Adversarial Robustness: … ☆49 · Updated Dec 30, 2021
- Official implementation of "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021) ☆103 · Updated Apr 30, 2024
- Cheap distillation for convolutional neural networks ☆35 · Updated Oct 22, 2018
- ☆61 · Updated Apr 24, 2020
- ☆27 · Updated Dec 13, 2022
- Fast NPU-aware Neural Architecture Search ☆21 · Updated Sep 6, 2021
- Zero-Shot Knowledge Distillation in Deep Networks ☆67 · Updated Apr 16, 2022
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) ☆30 · Updated Aug 19, 2020
- Knowledge distillation from Ensembles of Iterative pruning (BMVC 2020) ☆25 · Updated Aug 13, 2020
- Code for "Online Learned Continual Compression with Adaptive Quantization Modules" ☆27 · Updated Nov 11, 2020