samsucik / knowledge-distil-bert

Master's thesis project in collaboration with Rasa, focusing on knowledge distillation from BERT into a range of very small student networks and an analysis of the students' NLP capabilities.
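
For context, below is a minimal sketch of the standard soft-target distillation objective (Hinton et al., 2015) commonly used when distilling BERT into a small student. The function name and the `temperature`/`alpha` defaults are illustrative assumptions, not values taken from this repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Soft-target distillation loss: KL between temperature-softened
    teacher and student distributions, mixed with hard-label cross-entropy.
    Hyperparameter defaults here are illustrative only."""
    # Soft targets: teacher and student distributions at raised temperature
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL term is scaled by T^2 to keep gradient magnitudes comparable
    kd_term = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2
    # Ordinary cross-entropy on the student's unscaled logits
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```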

Related projects: