Distillation for Faster R-CNN at the classification level, regression level, feature level, and feature + mask level (an illustrative sketch of these loss terms follows the list below).
☆28 · Apr 1, 2021 · Updated 4 years ago
Alternatives and similar repositories for Distillation-of-Faster-rcnn
Users interested in Distillation-of-Faster-rcnn are comparing it to the repositories listed below.
- Semi-supervised Adaptive Distillation, a model compression method for object detection. ☆59 · Oct 9, 2019 · Updated 6 years ago
- Implementation of the CVPR 2019 paper Distilling Object Detectors with Fine-grained Feature Imitation. ☆421 · Jul 15, 2021 · Updated 4 years ago
- ☆25 · Sep 22, 2020 · Updated 5 years ago
- An object detection knowledge distillation framework built on PyTorch, currently supporting SSD and YOLOv5. ☆228 · Feb 7, 2022 · Updated 4 years ago
- Mask R-CNN with knowledge distillation. ☆21 · Nov 6, 2020 · Updated 5 years ago
- Implementation of the CVPR 2019 paper Distilling Object Detectors with Fine-grained Feature Imitation. ☆27 · Nov 22, 2022 · Updated 3 years ago
- In search of an effective and efficient pipeline for distilling knowledge in convolutional neural networks.
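
The repositories above all combine some subset of three distillation signals for detectors: a soft-label loss on classification logits, a regression loss against the teacher's box predictions, and feature imitation on backbone feature maps (optionally masked to regions near objects, as in the fine-grained variant). The PyTorch sketch below is a minimal illustration of these three terms under common formulations (Hinton-style temperature-scaled KL, smooth L1, masked MSE); the function names, shapes, and equal loss weighting are illustrative assumptions, not the API of any listed repository.

```python
# Hypothetical sketch of detector distillation losses; names and weights are
# illustrative assumptions, not taken from any repository listed above.
import torch
import torch.nn.functional as F

def classification_kd_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened class distributions."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients to be independent of T

def regression_kd_loss(student_boxes, teacher_boxes):
    """Smooth L1 between student and teacher box regression outputs."""
    return F.smooth_l1_loss(student_boxes, teacher_boxes)

def feature_imitation_loss(student_feat, teacher_feat, mask=None):
    """MSE between feature maps; an optional spatial mask restricts
    imitation to regions near ground-truth objects."""
    diff = (student_feat - teacher_feat) ** 2
    if mask is not None:  # mask: (N, 1, H, W), 1 near objects, 0 elsewhere
        diff = diff * mask
        denom = (mask.sum() * student_feat.size(1)).clamp(min=1.0)
        return diff.sum() / denom
    return diff.mean()

# Toy usage with dummy tensors standing in for detector head outputs.
s_logits, t_logits = torch.randn(8, 21), torch.randn(8, 21)
s_boxes, t_boxes = torch.randn(8, 4), torch.randn(8, 4)
s_feat, t_feat = torch.randn(1, 256, 32, 32), torch.randn(1, 256, 32, 32)
mask = (torch.rand(1, 1, 32, 32) > 0.5).float()

total = (classification_kd_loss(s_logits, t_logits)
         + regression_kd_loss(s_boxes, t_boxes)
         + feature_imitation_loss(s_feat, t_feat, mask))
print(total)
```

In practice the three terms are weighted against the ordinary detection loss, and the masked variant of feature imitation is what distinguishes the fine-grained approach from plain whole-map hint learning.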