AristotelisPap / Question-Answering-with-BERT-and-Knowledge-Distillation

Fine-tuned BERT on the SQuAD 2.0 dataset. Applied knowledge distillation (KD) to fine-tune DistilBERT (student) using the fine-tuned BERT as the teacher model, reducing model size by 40% relative to the original BERT.
26 · Feb 13, 2021 · Updated 5 years ago
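A minimal sketch of the distillation setup described above, assuming the Hugging Face transformers library with PyTorch. The checkpoint names, temperature, and loss weighting below are illustrative placeholders, not values taken from this repository; in the project the teacher would be a BERT model already fine-tuned on SQuAD 2.0.

```python
import torch
import torch.nn.functional as F
from transformers import AutoModelForQuestionAnswering

# Placeholder checkpoints: the actual project uses a BERT teacher
# fine-tuned on SQuAD 2.0 and a DistilBERT student.
teacher = AutoModelForQuestionAnswering.from_pretrained("bert-base-uncased")
student = AutoModelForQuestionAnswering.from_pretrained("distilbert-base-uncased")
teacher.eval()  # the teacher stays frozen during distillation

def distillation_loss(batch, temperature=2.0, alpha=0.5):
    """Combine the hard-label QA loss with a soft-label KL term
    on the start/end logits produced by the frozen teacher."""
    with torch.no_grad():
        t_out = teacher(input_ids=batch["input_ids"],
                        attention_mask=batch["attention_mask"])
    s_out = student(input_ids=batch["input_ids"],
                    attention_mask=batch["attention_mask"],
                    start_positions=batch["start_positions"],
                    end_positions=batch["end_positions"])

    # KL divergence between softened teacher and student distributions,
    # applied separately to the start- and end-position logits.
    kl = 0.0
    for s_logits, t_logits in [(s_out.start_logits, t_out.start_logits),
                               (s_out.end_logits, t_out.end_logits)]:
        kl = kl + F.kl_div(
            F.log_softmax(s_logits / temperature, dim=-1),
            F.softmax(t_logits / temperature, dim=-1),
            reduction="batchmean") * (temperature ** 2)

    # s_out.loss is the standard cross-entropy on the gold answer span.
    return alpha * s_out.loss + (1.0 - alpha) * kl
```

The returned loss would be backpropagated through the student only; the `alpha` weight trades off matching the gold answer spans against matching the teacher's softened predictions.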

Alternatives and similar repositories for Question-Answering-with-BERT-and-Knowledge-Distillation

Users interested in Question-Answering-with-BERT-and-Knowledge-Distillation are comparing it to the libraries listed below.
