☆34 · Updated Aug 20, 2023
Alternatives and similar repositories for Spherical-Knowledge-Distillation
Users interested in Spherical-Knowledge-Distillation are comparing it to the libraries listed below.
- Implementation of the paper "Task-Oriented Feature Distillation" ☆43 · Updated Apr 25, 2022
- ☆58 · Updated Jun 18, 2021
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆44 · Updated Sep 27, 2022
- ☆47 · Updated Sep 9, 2021
- Official implementation of the paper "Function-Consistent Feature Distillation" (ICLR 2023) ☆30 · Updated Jul 5, 2023
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆16 · Updated Oct 22, 2020
- The official implementation of [ACMMM2022] "Pay Attention to Your Positive Pairs: Positive Pair Aware Contrastive Knowledge Distillation" ☆11 · Updated May 27, 2023
- [NeurIPS'22] Projector Ensemble Feature Distillation ☆30 · Updated Jan 4, 2024
- Knowledge distillation: CVPR 2020 Oral, "Revisiting Knowledge Distillation via Label Smoothing Regularization" ☆584 · Updated Feb 15, 2023
- A simple, fast, efficient, end-to-end 3D object detector without NMS ☆30 · Updated Nov 30, 2021
- EfficientVLM: Fast and Accurate Vision-Language Models via Knowledge Distillation and Modal-adaptive Pruning (ACL 2023) ☆34 · Updated Jul 18, 2023
- ☆19 · Updated May 28, 2020
- A collection of papers (2019–2021) on zero-shot/data-free knowledge distillation ☆11 · Updated Sep 8, 2021
- A PyTorch implementation of the ICCV 2021 workshop paper "SimDis: Simple Distillation Baselines for Improving Small Self-supervised Models" ☆14 · Updated Jul 15, 2021
- This project attacked widely-used EEG spellers, e.g., P300 spellers and SSVEP spellers, with adversarial perturbation templates. We cons… ☆14 · Updated Jul 14, 2020
- Code for the CVPR 2020 paper "Distilling Cross-Task Knowledge via Relationship Matching" ☆49 · Updated Feb 7, 2021
- Distilling Knowledge via Intermediate Classifiers ☆16 · Updated Oct 3, 2021
- ☆10 · Updated Feb 22, 2022
- kNN-TL: k-Nearest-Neighbor Transfer Learning for Low-Resource Neural Machine Translation (ACL 2023) ☆11 · Updated Jul 26, 2023
- An open-source GUI for CNN-based EEG decoding and model interpretation ☆19 · Updated Sep 8, 2022
- ☆24 · Updated May 6, 2022
- SEED: Self-supervised Distillation for Visual Representation ☆16 · Updated Jul 20, 2022
- 🏆 The 2nd-place submission to the CVPR 2021 Evoked Emotion from Videos challenge ☆17 · Updated Jun 7, 2021
- Channel pruning for accelerating very deep neural networks ☆13 · Updated Mar 8, 2021
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆106 · Updated Sep 9, 2019
- DropNet: Reducing Neural Network Complexity via Iterative Pruning (ICML 2020) ☆16 · Updated Aug 24, 2020
- Official PyTorch implementation of "Deep Metric Learning with Spherical Embedding" (NeurIPS 2020) ☆42 · Updated Nov 28, 2020
- ☆11 · Updated Nov 5, 2024
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" ☆181 · Updated Dec 3, 2024
- Context-I2W: Mapping Images to Context-dependent words for Accurate Zero-Shot Composed Image Retrieval [AAAI 2024 Oral] ☆55 · Updated May 27, 2025
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation" (TNNLS), https://ieeexplore.ieee.org/abstract/… ☆12 · Updated Dec 21, 2022
- ☆114 · Updated Apr 21, 2021
- Codebase for "Reducing Representation Drift in Online Continual Learning" ☆14 · Updated Jun 8, 2021
- Code for the comparison methods ☆12 · Updated Mar 7, 2022
- ACCV 2022 source code of the paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" ☆12 · Updated Jul 5, 2023
- ☆11 · Updated Nov 11, 2021
- ☆37 · Updated Feb 1, 2022
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (the paper of NORM is published in IC… ☆20 · Updated Sep 18, 2023
- The official implementation of the CVPR 2021 paper "Improving Weakly Supervised Visual Grounding by Contrastive Knowledge Distillation" ☆12 · Updated Oct 15, 2021