ChenRocks / Distill-BERT-Textgen

Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
131 stars · Updated 3 years ago
