OSVAI / SSD-KD
The official project website of "Small Scale Data-Free Knowledge Distillation" (SSD-KD for short, published at CVPR 2024).
☆17 · Updated 11 months ago
Alternatives and similar repositories for SSD-KD
Users interested in SSD-KD are comparing it to the libraries listed below.
- Code for "Multi-level Logit Distillation" (CVPR 2023) ☆64 · Updated 8 months ago
- Official implementation of "Knowledge Distillation with Refined Logits" ☆14 · Updated 9 months ago
- The official implementation of [NeurIPS 2024] "Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation" https://ar… ☆39 · Updated 5 months ago
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" ☆174 · Updated 6 months ago
- A PyTorch implementation of the CVPR 2024 paper "D4M: Dataset Distillation via Disentangled Diffusion Model" ☆31 · Updated 9 months ago
- Class Attention Transfer Based Knowledge Distillation (CVPR 2023) ☆44 · Updated last year
- ☆26 · Updated last year
- Official code for "Scale Decoupled Distillation" ☆41 · Updated last year
- PyTorch code and checkpoint release for OFA-KD: https://arxiv.org/abs/2310.19444 ☆125 · Updated last year
- "Low-Rank Rescaled Vision Transformer Fine-Tuning: A Residual Design Approach" (CVPR 2024) ☆22 · Updated 10 months ago
- [ECCV 2024] Isomorphic Pruning for Vision Models ☆68 · Updated 10 months ago
- A training-free approach to accelerating ViTs and VLMs by pruning redundant tokens based on similarity ☆24 · Updated last week
- A curated list of awesome out-of-distribution detection resources ☆45 · Updated this week
- Official implementation of the NeurIPS 2024 paper "Visual Fourier Prompt Tuning" ☆28 · Updated 4 months ago
- ☆28 · Updated 11 months ago
- [ICCV 2023] "Robust Mixture-of-Expert Training for Convolutional Neural Networks" by Yihua Zhang, Ruisi Cai, Tianlong Chen, Guanhua Zhang, Hua… ☆56 · Updated last year
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation) ☆78 · Updated 2 months ago
- The official project website of "NORM: Knowledge Distillation via N-to-One Representation Matching" (the paper of NORM is published in IC… ☆20 · Updated last year
- Knowledge Amalgamation, Multi-teacher KD, Ensemble KD ☆10 · Updated 3 years ago
- [CVPR 2024] Official implementation of the paper "FreeKD: Knowledge Distillation via Semantic Frequency Prompt" ☆44 · Updated last year
- [ICCV 2023 oral] Official repository for the paper "Sensitivity-Aware Visual Parameter-Efficient Fine-Tuning" ☆70 · Updated last year
- Official implementation of "Dynamic Tuning Towards Parameter and Inference Efficiency for ViT Adaptation" (NeurIPS 2024) ☆46 · Updated 5 months ago
- Official implementation of the paper "Improving Knowledge Distillation via Regularizing Feature Norm and Direction" ☆20 · Updated last year
- [AAAI 2024] "Understanding the Role of the Projector in Knowledge Distillation" ☆18 · Updated last year
- Official PyTorch implementation of "Which Tokens to Use? Investigating Token Reduction in Vision Transformers", presented at ICCV 2023 NIVT … ☆35 · Updated last year
- Official implementation of the paper "Knowledge Diffusion for Distillation" (NeurIPS 2023) ☆86 · Updated last year
- "Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching" (ICLR 2024) ☆101 · Updated last year
- [CVPR 2024] "VkD: Improving Knowledge Distillation using Orthogonal Projections" ☆53 · Updated 7 months ago
- Official PyTorch code for "Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?" (https://arxiv.org/abs/2305.12954) ☆47 · Updated last year
- [CVPR '25] Official implementation of the paper "Rethinking Few-Shot Adaptation of Vision-Language Models in Two Stages", accepted at (an… ☆14 · Updated 2 months ago