lliai / SHAKE
☆18 · Updated 2 years ago
Alternatives and similar repositories for SHAKE
Users interested in SHAKE are comparing it to the libraries listed below.
- ACCV2022 Source Code of paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" ☆12 · Updated 2 years ago
- Switchable Online Knowledge Distillation ☆19 · Updated last year
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && Pytorch Implementations of… ☆110 · Updated 3 years ago
- Pytorch implementation of Split to Merge: Unifying Separated Modalities for Unsupervised Domain Adaptation (CVPR'24) ☆35 · Updated 3 months ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆100 · Updated 3 years ago
- TF-FD ☆20 · Updated 3 years ago
- [ICASSP-2021] Official implementations of Multi-View Contrastive Learning for Online Knowledge Distillation (MCL-OKD) ☆27 · Updated 4 years ago
- CVPR 2023, Class Attention Transfer Based Knowledge Distillation ☆46 · Updated 2 years ago
- Official implementation of the paper "Masked Autoencoders are Efficient Class Incremental Learners" ☆45 · Updated last year
- Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation. NeurIPS 2022. ☆33 · Updated 3 years ago
- Official repository for "Self-Distilled Vision Transformer for Domain Generalization" (ACCV-2022 ORAL) ☆41 · Updated 3 years ago
- Official repository for ACCV 2020 paper 'Class-Wise Difficulty-Balanced Loss for Solving Class-Imbalance' ☆19 · Updated 4 years ago
- An open-world scenario domain generalization code base ☆27 · Updated 2 years ago
- Awesome Knowledge-Distillation for CV ☆91 · Updated last year
- [ECCV 2022] Implementation of the paper "Locality Guidance for Improving Vision Transformers on Tiny Datasets" ☆82 · Updated 3 years ago
- ☆45 · Updated 2 years ago
- Source Code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Updated 3 years ago
- Code for 'Multi-level Logit Distillation' (CVPR2023) ☆71 · Updated last year
- ☆47 · Updated 4 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆77 · Updated last year
- Official code for Scale Decoupled Distillation ☆43 · Updated last year
- ☆28 · Updated 2 years ago
- ☆33 · Updated 10 months ago
- [TPAMI-2023] Official implementations of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition ☆26 · Updated 2 years ago
- [NeurIPS 2021] SSUL: Semantic Segmentation with Unknown Label for Exemplar-based Class-Incremental Learning ☆63 · Updated 2 years ago
- [NeurIPS'22] Projector Ensemble Feature Distillation ☆30 · Updated 2 years ago
- CVPR 2022 - official implementation for "Long-Tailed Recognition via Weight Balancing" https://arxiv.org/abs/2203.14197 ☆128 · Updated last year
- ☆40 · Updated 4 years ago
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (The paper of NORM is published in IC… ☆20 · Updated 2 years ago
- Code for Paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆43 · Updated 3 years ago