samsucik / knowledge-distil-bert
Master's thesis project in collaboration with Rasa, focusing on knowledge distillation from BERT into various very small networks and on analysis of the student models' NLP capabilities.
13 stars · Updated Sep 30, 2022

Alternatives and similar repositories for knowledge-distil-bert

Users interested in knowledge-distil-bert are comparing it to the libraries listed below.

