dvlab-research / ReviewKD
Distilling Knowledge via Knowledge Review, CVPR 2021
☆276 · Updated 2 years ago
Alternatives and similar repositories for ReviewKD
Users interested in ReviewKD are comparing it to the repositories listed below.
- Official code for the ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition" ☆190 · Updated last year
- [AAAI-2021, TKDE-2023] Official implementation of "Cross-Layer Distillation with Semantic Calibration" ☆75 · Updated last year
- Masked Generative Distillation (ECCV 2022) ☆228 · Updated 2 years ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" (NeurIPS 2022) ☆149 · Updated 2 years ago
- Official implementation of "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021) ☆101 · Updated last year
- ResRep: Lossless CNN Pruning via Decoupling Remembering and Forgetting (ICCV 2021) ☆297 · Updated 2 years ago
- RM Operation can equivalently convert ResNet to VGG, which is better for pruning, and can help RepVGG perform better when the depth is la… ☆211 · Updated 2 years ago
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distill… ☆872 · Updated last year
- Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021) ☆119 · Updated 4 years ago
- RepMLPNet: Hierarchical Vision MLP with Re-parameterized Locality (CVPR 2022) ☆306 · Updated 2 years ago
- ☆193 · Updated 4 years ago
- Paraphrasing Complex Network: Network Compression via Factor Transfer (NeurIPS 2018) ☆20 · Updated 5 years ago
- (CVPR 2021, Oral) Dynamic Slimmable Network