samsucik / knowledge-distil-bert

Master's thesis project in collaboration with Rasa, focusing on distilling knowledge from BERT into several much smaller networks and analysing the students' NLP capabilities.
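The repository's own code is not shown here, so the following is only a rough, hypothetical illustration of the core technique the description names: Hinton-style knowledge distillation, where a small student is trained to match the teacher's temperature-softened output distribution via a KL-divergence term. Function names, the temperature value, and the example logits are illustrative assumptions, not taken from the project.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution
    # that exposes the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 (as in Hinton et al., 2015) so that gradient
    # magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In practice this term is usually combined with an ordinary cross-entropy loss on the hard labels; when the student's logits match the teacher's, the distillation term goes to zero.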
