ghwang-s / abkd
ICML 2025 Oral: ABKD: Pursuing a Proper Allocation of the Probability Mass in Knowledge Distillation via α-β-Divergence
☆39 · Updated 2 months ago
Alternatives and similar repositories for abkd
Users interested in abkd are comparing it to the libraries listed below.
- ☆51 · Updated 11 months ago
- Awesome-Low-Rank-Adaptation ☆118 · Updated last year
- Reading notes on papers related to OOD Generalization ☆32 · Updated 10 months ago
- Instruction Tuning in the Continual Learning paradigm ☆62 · Updated 8 months ago
- A curated list of awesome papers on dataset reduction, including dataset distillation (dataset condensation) and dataset pruning (coreset…) ☆58 · Updated 9 months ago
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm ☆75 · Updated 8 months ago
- [ICLR 2025 Oral🔥] SD-LoRA: Scalable Decoupled Low-Rank Adaptation for Class Incremental Learning ☆60 · Updated 4 months ago
- [ICML 2025] Official code of "DAMA: Data- and Model-aware Alignment of Multi-modal LLMs"