lgcnsai / PS-KD-Pytorch
Official PyTorch implementation of PS-KD
☆88 · Updated 2 years ago
Alternatives and similar repositories for PS-KD-Pytorch
Users interested in PS-KD-Pytorch are comparing it to the repositories listed below.
- [CVPR-2022] Official implementation of "Knowledge Distillation with the Reused Teacher Classifier". ☆98 · Updated 3 years ago
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && Pytorch Implementations of… ☆108 · Updated 2 years ago
- A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", … ☆179 · Updated 3 years ago
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization". ☆40 · Updated 2 years ago
- ☆127 · Updated 4 years ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022. ☆147 · Updated 2 years ago
- The official code of our CVPR-2023 paper "Sharpness-Aware Gradient Matching for Domain Generalization". ☆75 · Updated 2 years ago
- Official implementation of the paper "Function-Consistent Feature Distillation" (ICLR 2023). ☆30 · Updated 2 years ago
- [CVPR 2022] Official implementation of "Long-Tailed Recognition via Weight Balancing", https://arxiv.org/abs/2203.14197. ☆127 · Updated 7 months ago
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation". ☆176 · Updated 7 months ago
- [CVPR2022] This repository contains code for the paper "Nested Collaborative Learning for Long-Tailed Visual Recognition", published at C… ☆88 · Updated last year
- A method of dataset condensation, accepted by CVPR-2022. ☆70 · Updated last year
- [ICLR 2022] Official PyTorch implementation of "Uncertainty Modeling for Out-of-Distribution Generalization" in International Conference … ☆159 · Updated 3 years ago
- [AAAI-2021, TKDE-2023] Official implementation of "Cross-Layer Distillation with Semantic Calibration". ☆75 · Updated 11 months ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020). ☆108 · Updated 5 years ago
- [ICLR 2023 Spotlight] Divide to Adapt: Mitigating Confirmation Bias for Domain Adaptation of Black-Box Predictors. ☆39 · Updated 2 years ago
- ☆27 · Updated 3 years ago
- ☆93 · Updated 3 years ago
- Class Attention Transfer Based Knowledge Distillation (CVPR 2023). ☆44 · Updated 2 years ago
- Parametric Contrastive Learning (ICCV 2021) & GPaCo (TPAMI 2023). ☆253 · Updated 9 months ago
- Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI-2021). ☆117 · Updated 4 years ago
- An official implementation of the ECCV 2022 paper "Attention Diversification for Domain Generalization". ☆43 · Updated 2 years ago
- The implementation of "Comprehensive Knowledge Distillation with Causal Intervention". ☆14 · Updated 3 years ago
- ResLT: Residual Learning for Long-tailed Recognition (TPAMI 2022). ☆61 · Updated last year
- Code for the ICML 2022 paper "Efficient Test-Time Model Adaptation without Forgetting". ☆128 · Updated 2 years ago
- Official implementation of "Curriculum of Data Augmentation for Long-tailed Recognition" (CUDA) (ICLR'23 Spotlight). ☆21 · Updated 2 years ago
- ☆47 · Updated 3 years ago
- [CVPR 2023] This repository includes the official implementation of our paper "Masked Autoencoders Enable Efficient Knowledge Distillers". ☆106 · Updated last year
- Code for our ICCV 2021 paper "Generalized Source-free Domain Adaptation". ☆106 · Updated 3 years ago
- Code for "Multi-level Logit Distillation" (CVPR 2023). ☆66 · Updated 9 months ago