aitsc / GLMKD

Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method; GKD: A General Knowledge Distillation Framework for Large-scale Pre-trained Language Model
31 stars · Updated last year
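
The papers above concern distilling a large pre-trained teacher language model into a smaller student. As a point of reference only, and not the code in this repository, a minimal logit-level distillation loss in PyTorch might look like the sketch below; the temperature and loss weighting are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """Soft-label KD loss: KL divergence between temperature-scaled
    teacher and student distributions (hypothetical helper, not the
    repository's API)."""
    # Student log-probabilities and teacher probabilities at temperature T
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 to keep gradient magnitudes comparable across temperatures
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2

# Usage example with dummy logits (batch of 4, vocabulary of 10)
student_logits = torch.randn(4, 10, requires_grad=True)
teacher_logits = torch.randn(4, 10)
loss = distillation_loss(student_logits, teacher_logits)
loss.backward()
```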

Alternatives and similar repositories for GLMKD:

Users interested in GLMKD are comparing it to the libraries listed below.