[PR 2024] Official PyTorch Code for "Dual Teachers for Self-Knowledge Distillation"
☆13 · Updated Nov 28, 2024
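Many of the repositories listed below build on knowledge distillation, where a student model is trained to match a teacher's temperature-softened output distribution. As background, here is a minimal, framework-free sketch of the standard distillation loss (plain Python on logit lists; the function names and default temperature are illustrative, not taken from the DTSKD code):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-softened softmax: higher T flattens the distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 as in the classic Hinton et al. formulation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

The loss is zero when student and teacher logits agree and grows as their softened distributions diverge; self-distillation variants such as DTSKD replace the fixed external teacher with signals derived from the student itself.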
Alternatives and similar repositories for DTSKD
Users interested in DTSKD are comparing it to the repositories listed below.
- SelecMix: Debiased Learning by Contradicting-pair Sampling (NeurIPS 2022) · ☆13 · Updated Jun 5, 2024
- Contrastive Continual Learning with Importance Sampling and Prototype-Instance Relation Distillation · ☆12 · Updated Jul 22, 2024
- Create your name tag with M5Paper series! · ☆18 · Updated Feb 23, 2026
- Regularly Truncated M-estimators for Learning with Noisy Labels · ☆12 · Updated Apr 24, 2024
- [NeurIPS'22] Projector Ensemble Feature Distillation · ☆30 · Updated Jan 4, 2024
- SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images · ☆11 · Updated Apr 11, 2023
- Official repository for the paper DL2PA: Hyperspherical Classification with Dynamic Label-to-Prototype Assignment (CVPR 2024).