intersun / PKD-for-BERT-Model-Compression

PyTorch implementation of Patient Knowledge Distillation for BERT Model Compression (Sun et al., EMNLP 2019).
200 stars · Updated 5 years ago
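For orientation, the core idea of patient knowledge distillation is to train a smaller student BERT not only on the teacher's soft labels but also on its intermediate [CLS] representations. The sketch below is illustrative only, assuming pre-computed logits and per-layer hidden states from both models; the function name, arguments, and hyperparameter values are hypothetical and not the repository's actual interface.

```python
import torch.nn.functional as F

def pkd_loss(student_logits, teacher_logits, labels,
             student_hidden, teacher_hidden,
             temperature=4.0, alpha=0.5, beta=100.0):
    """Illustrative PKD-style objective: task CE + soft-label KD + a
    'patient' loss on normalized intermediate [CLS] states (sketch only)."""
    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    # Soft-label distillation on temperature-scaled teacher logits.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    # Patient loss: MSE between L2-normalized [CLS] hidden states of
    # matched student/teacher layers (e.g. PKD-Skip pairs the student's
    # layers with every other teacher layer).
    pt = 0.0
    for s_h, t_h in zip(student_hidden, teacher_hidden):
        s_cls = F.normalize(s_h[:, 0], p=2, dim=-1)
        t_cls = F.normalize(t_h[:, 0], p=2, dim=-1)
        pt = pt + F.mse_loss(s_cls, t_cls)

    return (1 - alpha) * ce + alpha * kd + beta * pt
```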

Alternatives and similar repositories for PKD-for-BERT-Model-Compression:

Users interested in PKD-for-BERT-Model-Compression are comparing it to the libraries listed below.