samsucik / knowledge-distil-bert

A Master's thesis project in collaboration with Rasa, focusing on knowledge distillation from BERT into several very small student networks and on analysing the students' NLP capabilities.
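As a rough illustration of the technique the project studies (not the repository's actual code), knowledge distillation typically trains the small student to match the teacher's temperature-softened output distribution. A minimal sketch of that loss, following Hinton et al.'s formulation; the function names here are hypothetical:

```python
import math

def softmax(logits, temperature=1.0):
    # Soften logits with a temperature before normalizing.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Identical logits give zero loss; mismatched logits give a positive loss.
print(distillation_loss([3.0, 1.0, 0.2], [3.0, 1.0, 0.2]))  # → 0.0
```

In practice this soft-target loss is combined with the ordinary cross-entropy on the ground-truth labels, weighted by a mixing coefficient.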
