☆34 · Updated Aug 20, 2023
Alternatives and similar repositories for Spherical-Knowledge-Distillation
Users interested in Spherical-Knowledge-Distillation are comparing it to the libraries listed below.
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022 -- Distilling a Powerful Student Model via Online Knowledge Distillation (☆31, updated Nov 11, 2021)
- ☆58, updated Jun 18, 2021
- Implementation of the paper "Task-Oriented Feature Distillation" (☆43, updated Apr 25, 2022)
- The official implementation of [ACMMM2022] Pay Attention to Your Positive Pairs: Positive Pair Aware Contrastive Knowledge Distillation (☆11, updated May 27, 2023)
- Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021) (☆119, updated Feb 9, 2021)
- Official implementation of the paper "Function-Consistent Feature Distillation" (ICLR 2023) (☆30, updated Jul 5, 2023)
- A PyTorch implementation of the ICCV 2021 workshop paper "SimDis: Simple Distillation Baselines for Improving Small Self-supervised Models" (☆14, updated Jul 15, 2021)
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… (☆12, updated Dec 21, 2022)
- Code for the ECCV 2020 paper "Improving Knowledge Distillation via Category Structure" (☆10, updated Mar 15, 2021)
- ☆47, updated Sep 9, 2021
- 🏆 The 2nd-place submission to the CVPR 2021 Evoked Emotion from Videos challenge (☆17, updated Jun 7, 2021)
- The code for Joint Neural Architecture Search and Quantization (☆14, updated Apr 10, 2019)
- Code for "Self-Distillation as Instance-Specific Label Smoothing" (☆16, updated Oct 22, 2020)
- Fork of diux-dev/imagenet18 (☆16, updated Oct 4, 2018)
- WeightNet: Revisiting the Design Space of Weight Networks (☆19, updated Feb 20, 2021)
- SHAKE (☆18, updated Apr 14, 2023)
- ☆17, updated Jul 10, 2022
- [NeurIPS 2019] E2-Train: Training State-of-the-art CNNs with Over 80% Less Energy