PyTorch code and checkpoints release for OFA-KD: https://arxiv.org/abs/2310.19444
☆136 · Updated Apr 19, 2024
Alternatives and similar repositories for OFAKD
Users interested in OFAKD are comparing it to the repositories listed below.
- The official (TMLR) implementation of LumiNet: Perception-Driven Knowledge Distillation via Statistical Logit Calibration ☆17 · Updated Aug 17, 2025
- CVPR 2023, Class Attention Transfer Based Knowledge Distillation ☆46 · Updated Jun 13, 2023
- Official implementation for the paper "Knowledge Diffusion for Distillation", NeurIPS 2023 ☆94 · Updated Jan 24, 2024
- ☆19 · Updated Jan 16, 2026
- Code for 'Multi-level Logit Distillation' (CVPR 2023)
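The logit-based methods listed above (OFA-KD, LumiNet, multi-level logit distillation) all build on the classic knowledge-distillation objective: a KL divergence between temperature-softened teacher and student distributions, scaled by T². The sketch below is a minimal pure-Python illustration of that common baseline, not code taken from any of the listed repositories; the function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_kl_loss(teacher_logits, student_logits, T=4.0):
    """Hinton-style logit distillation loss (illustrative sketch):
    KL(p_teacher || p_student) on T-softened distributions, scaled by T^2
    so gradients keep a comparable magnitude as T varies."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return T * T * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Identical logits give zero loss; a mismatched student gives a positive loss.
print(kd_kl_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))
print(kd_kl_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0.0)
```

The listed methods differ mainly in how they calibrate or restructure the logits before this step (e.g. statistical calibration in LumiNet, multi-level alignment in the CVPR 2023 work), while OFA-KD targets distillation across heterogeneous architectures.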