giakoumoglou / distillers
[arXiv 2024] PyTorch implementation of RRD: https://arxiv.org/abs/2407.12073
☆13 · Updated 7 months ago
Alternatives and similar repositories for distillers
Users interested in distillers are comparing it to the libraries listed below.
- Code for "Multi-level Logit Distillation" (CVPR 2023) ☆70 · Updated last year
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" (NeurIPS 2022) ☆149 · Updated 2 years ago
- Official code for Scale Decoupled Distillation ☆41 · Updated last year
- Class Attention Transfer Based Knowledge Distillation (CVPR 2023) ☆46 · Updated 2 years ago
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" ☆179 · Updated 10 months ago
- Official PyTorch (MMCV) implementation of "Adversarial AutoMixup" (ICLR 2024 spotlight) ☆69 · Updated 11 months ago
- [BMVC 2022] Official repository for "How to Train Vision Transformer on Small-scale Datasets?" ☆163 · Updated last year
- [NeurIPS 2024] Official implementation of "Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation" https://ar… ☆44 · Updated 10 months ago
- Official implementation of the paper "FedSIS: Federated Split Learning with Intermediate Representation Sampling for Privacy-preserving G…" ☆16 · Updated 2 years ago
- Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation (NeurIPS 2022) ☆32 · Updated 3 years ago
- Training ImageNet / CIFAR models with SOTA strategies and techniques such as ViT, KD, Rep, etc. ☆84 · Updated last year
- [ECCV 2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && PyTorch implementations of… ☆109 · Updated 2 years ago
- [CVPR 2022] Official implementation of "Knowledge Distillation with the Reused Teacher Classifier" ☆100 · Updated 3 years ago
- ☆32 · Updated 7 months ago
- A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation"… ☆181 · Updated 3 years ago
- Official code for "Cumulative Spatial Knowledge Distillation for Vision Transformers" (ICCV 2023) https://openaccess.thecvf.com/content/ICC… ☆15 · Updated last year
- ☆28 · Updated last year
- PyTorch code and checkpoint release for VanillaKD: https://arxiv.org/abs/2305.15781 ☆76 · Updated last year
- [ICME 2024] Official repository of iDAT: inverse Distillation Adapter-Tuning ☆13 · Updated last year
- ☆24 · Updated 3 years ago
- ☆28 · Updated 3 years ago
- ☆127 · Updated 4 years ago
- Official implementation of ImbSAM (Imbalanced SAM) ☆24 · Updated last year
- Official implementation of "Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information" ☆11 · Updated 2 years ago
- Official PyTorch implementation of PS-KD ☆90 · Updated 3 years ago
- A repository maintaining a collection of important papers on knowledge distillation (awesome-knowledge-distillation) ☆80 · Updated 7 months ago
- Unofficial PyTorch implementation of KeepAugment ☆24 · Updated 3 years ago
- A simple PyTorch reimplementation of "Online Knowledge Distillation via Collaborative Learning" ☆50 · Updated 2 years ago
- ☆23 · Updated 2 years ago
- [IMWUT/UbiComp 2024] Optimization-Free Test-Time Adaptation for Cross-Person Activity Recognition ☆21 · Updated last year