aitsc / GLMKD

Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method; GKD: A General Knowledge Distillation Framework for Large-scale Pre-trained Language Model
32 stars · Updated 2 years ago
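
The repository accompanies distillation methods for GLM-style pre-trained language models. As a rough illustration only (not the repository's actual API), a minimal logit-only distillation loss in PyTorch might look like the sketch below; the function name, temperature value, and scaling convention are assumptions, not taken from the GLMKD code.

```python
import torch.nn.functional as F

def logit_distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions.

    A generic logit-only distillation objective: no intermediate-layer
    or hard-label supervision, illustrating the idea suggested by the
    paper titles above. Names and defaults here are hypothetical.
    """
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    # batchmean KL, scaled by T^2 as in standard temperature-scaled distillation
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * (t * t)
```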

Alternatives and similar repositories for GLMKD

Users who are interested in GLMKD are comparing it to the libraries listed below.
