BERT distillation (BERT-based distillation experiments)
☆314 · Jul 30, 2020 · Updated 5 years ago
Alternatives and similar repositories for bert_distill
Users interested in bert_distill are comparing it to the repositories listed below.
- Knowledge Distillation from BERT ☆54 · Jan 7, 2019 · Updated 7 years ago
- Knowledge distillation for text classification with PyTorch: Chinese text classification with BERT and XLNet as teacher models and a BiLSTM student. ☆229 · Jul 27, 2022 · Updated 3 years ago
- Distilling Task-Specific Knowledge from BERT into Simple Neural Networks (the sketch after this list illustrates the underlying loss). ☆15 · Aug 28, 2020 · Updated 5 years ago
- PyTorch implementation of Patient Knowledge Distillation for BERT Model Compression ☆204 · Sep 20, 2019 · Updated 6 years ago
- Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab. ☆3,157 · Jan 22, 2024 · Updated 2 years ago
- A simple, easy-to-use TinyBERT: a pretrained language model obtained by knowledge distillation from BERT. ☆271 · Oct 24, 2020 · Updated 5 years ago
- A PyTorch-based knowledge distillation toolkit for natural language processing ☆1,697 · May 8, 2023 · Updated 2 years ago
- ☆61 · Nov 14, 2019 · Updated 6 years ago
- BERT distillation in practice, covering distilling BERT into a BiLSTM as well as TinyBERT. ☆13 · Apr 23, 2022 · Updated 3 years ago
- A reproduction of the ACL 2020 FastBERT paper (https://arxiv.org/pdf/2004.02178.pdf). ☆192 · Dec 15, 2021 · Updated 4 years ago
- The source code of FastBERT (ACL 2020). ☆609 · Oct 29, 2021 · Updated 4 years ago
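
Most of the repositories above implement some variant of the same objective: a small student model (often a BiLSTM) is trained to match the temperature-softened output distribution of a BERT teacher while still fitting the gold labels. Below is a minimal PyTorch sketch of that combined loss, assuming a standard classification setup; the function name `distillation_loss` and the defaults `T=2.0` and `alpha=0.5` are illustrative, not taken from any listed repo.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hinton-style soft-target distillation loss (illustrative sketch)."""
    # Soft targets: KL(teacher || student) at temperature T
    # (F.kl_div treats `target` as the reference distribution);
    # the T^2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In practice the teacher's logits are precomputed with the fine-tuned BERT in eval mode, and T and alpha are tuned per task; higher temperatures expose more of the teacher's inter-class similarity structure to the student.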