songmzhang / DSKD
Repo for the EMNLP'24 Paper "Dual-Space Knowledge Distillation for Large Language Models". A general white-box KD framework for both same-tokenizer and cross-tokenizer LLM distillation.
61 stars · Updated Mar 21, 2026
