☆47 · Sep 9, 2021 · Updated 4 years ago
Alternatives and similar repositories for KD_SRRL
Users interested in KD_SRRL are comparing it to the libraries listed below.
- Emonet: unofficial implementation of "Estimation of continuous valence and arousal levels from faces in naturalistic conditions", published in Na… ☆21 · Dec 20, 2022 · Updated 3 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration" ☆78 · Jul 29, 2024 · Updated last year
- Implementation of the paper "Task-Oriented Feature Distillation" ☆43 · Apr 25, 2022 · Updated 3 years ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier" ☆103 · Jun 16, 2022 · Updated 3 years ago
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Dec 21, 2022 · Updated 3 years ago
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (the NORM paper is published in IC…) ☆20 · Sep 18, 2023 · Updated 2 years ago
- A knowledge distillation toolbox based on mmsegmentation ☆47 · Nov 27, 2022 · Updated 3 years ago
- ☆128 · Nov 2, 2020 · Updated 5 years ago
- ☆12 · Nov 28, 2022 · Updated 3 years ago
- ☆218 · Mar 19, 2021 · Updated 5 years ago
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆277 · Dec 16, 2022 · Updated 3 years ago
- The tutorial of the ECCV2022 WCPA Challenge Track 2 ☆29 · May 18, 2022 · Updated 3 years ago
- The implementation for "Comprehensive Knowledge Distillation with Causal Intervention" ☆15 · Mar 12, 2022 · Updated 4 years ago
- Graph Knowledge Distillation ☆13 · Mar 6, 2020 · Updated 6 years ago
- The official repo for the CVPR 2021 L2ID paper "Distill on the Go: Online knowledge distillation in self-supervised learning" ☆12 · Nov 15, 2021 · Updated 4 years ago
- [NeurIPS 2022] "Adversarial Training with Complementary Labels: On the Benefit of Gradually Informative Attacks" ☆13 · Nov 11, 2022 · Updated 3 years ago
- Intra-class Feature Variation Distillation for Semantic Segmentation (ECCV 2020) ☆72 · Sep 10, 2020 · Updated 5 years ago
- [ICML2024] DetKDS: Knowledge Distillation Search for Object Detectors ☆19 · Jul 11, 2024 · Updated last year
- The official implementation of [ACMMM2022] "Pay Attention to Your Positive Pairs: Positive Pair Aware Contrastive Knowledge Distillation" ☆11 · May 27, 2023 · Updated 2 years ago
- Official implementation for "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR-2021) ☆103 · Apr 30, 2024 · Updated last year
- ☆266 · Nov 30, 2022 · Updated 3 years ago
- [ECCV2020] Knowledge Distillation Meets Self-Supervision ☆237 · Dec 15, 2022 · Updated 3 years ago
- Bag of Instances Aggregation Boosts Self-supervised Distillation (ICLR 2022) ☆33 · Apr 26, 2022 · Updated 3 years ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Jun 18, 2020 · Updated 5 years ago
- (CVPR2022) Official PyTorch implementation of KDEP: "Knowledge Distillation as Efficient Pre-training: Faster Convergence, Higher Data-eff…" ☆60 · Jul 21, 2022 · Updated 3 years ago
- ☆38 · Jan 11, 2024 · Updated 2 years ago
- IEEE VCIP 2021: AnomalyHop: An SSL-based Image Anomaly Localization Method ☆14 · Sep 18, 2021 · Updated 4 years ago
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation (https://arxiv.org/abs/2203.08679) and [ICCV2023] "DOT: A Distill…" ☆900 · Nov 5, 2023 · Updated 2 years ago
- Training Signal Annealing ☆12 · Feb 11, 2020 · Updated 6 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods ☆2,427 · Oct 16, 2023 · Updated 2 years ago
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019 ☆416 · May 17, 2021 · Updated 4 years ago
- StarGAN with a triple consistency loss ☆13 · Nov 27, 2018 · Updated 7 years ago
- Multiresolution Knowledge Distillation for Anomaly Detection ☆17 · Jun 17, 2021 · Updated 4 years ago
- Official implementation for "Knowledge Distillation with Refined Logits" ☆23 · Aug 26, 2024 · Updated last year
- ☆11 · Feb 19, 2021 · Updated 5 years ago
- [BMVC 2022] Information Theoretic Representation Distillation ☆19 · Oct 6, 2023 · Updated 2 years ago
- ☆23 · May 6, 2022 · Updated 3 years ago
- [NeurIPS2022] Let Images Give You More: Point Cloud Cross-Modal Training for Shape Analysis ☆73 · Jan 30, 2023 · Updated 3 years ago
- Awesome Knowledge-Distillation for CV ☆94 · Apr 30, 2024 · Updated last year