aryanasadianuoit / Distilling-Knowledge-via-Intermediate-Classifiers
Distilling Knowledge via Intermediate Classifiers
☆15 · Updated 3 years ago
Alternatives and similar repositories for Distilling-Knowledge-via-Intermediate-Classifiers
Users interested in Distilling-Knowledge-via-Intermediate-Classifiers are comparing it to the repositories listed below.
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆74 · Updated last year
- The implementation of AAAI 2021 paper "Progressive Network Grafting for Few-Shot Knowledge Distillation". ☆32 · Updated 9 months ago
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning. ☆48 · Updated 2 years ago
- Graph Knowledge Distillation ☆13 · Updated 5 years ago
- ☆26 · Updated 4 years ago
- Data-Free Network Quantization With Adversarial Knowledge Distillation, PyTorch ☆29 · Updated 3 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆75 · Updated 9 months ago
- [ICLR-2020] Dynamic Sparse Training: Find Efficient Sparse Network From Scratch With Trainable Masked Layers. ☆31 · Updated 5 years ago
- Codes for paper "Few Shot Network Compression via Cross Distillation", AAAI 2020. ☆31 · Updated 5 years ago
- [IJCAI'22 Survey] Recent Advances on Neural Network Pruning at Initialization. ☆59 · Updated last year
- Code for paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆40 · Updated 2 years ago
- Official implementation for (Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching, AAAI-2021) ☆117 · Updated 4 years ago
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022 -- Distilling a Powerful Student Model via Online Knowledge Distillation ☆28 · Updated 3 years ago
- Demonstration of transfer of knowledge and generalization with distillation ☆53 · Updated 6 years ago
- ☆22 · Updated 4 years ago
- ZSKD with PyTorch ☆30 · Updated last year
- A generic code base for neural network pruning, especially for pruning at initialization. ☆30 · Updated 2 years ago
- Codes for ECCV2020 paper "Improving Knowledge Distillation via Category Structure". ☆10 · Updated 4 years ago
- Pruning Filters For Efficient ConvNets, PyTorch Implementation. ☆30 · Updated 5 years ago
- PyTorch implementation for Channel Distillation ☆100 · Updated 4 years ago
- [ICLR'23] Trainability Preserving Neural Pruning (PyTorch) ☆33 · Updated last year
- [ICLR 2022] "Learning Pruning-Friendly Networks via Frank-Wolfe: One-Shot, Any-Sparsity, and No Retraining" by Lu Miao*, Xiaolong Luo*, T… ☆29 · Updated 3 years ago
- Feature Fusion for Online Mutual Knowledge Distillation Code ☆26 · Updated 4 years ago
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022 — Carrying out CNN Channel Pruning in a White Box ☆18 · Updated 3 years ago
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆16 · Updated 4 years ago
- ☆33 · Updated last year
- A PyTorch implementation of scalable neural networks. ☆23 · Updated 4 years ago
- Auto-Prox-AAAI24 ☆13 · Updated last year
- Reproducing VID (CVPR 2019), work in progress ☆20 · Updated 5 years ago
- S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021) ☆64 · Updated 3 years ago