luxuantao / distill_BERT_into_RNN-CNN
Reproduction of the paper "Distilling Task-Specific Knowledge from BERT into Simple Neural Networks"
☆15 · Updated 4 years ago
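The reproduced paper distills BERT's task-specific knowledge into a small student network (a single-layer BiLSTM in the original work) by matching logits. Below is a minimal PyTorch sketch of that kind of objective; the `alpha` weight, toy tensor shapes, and function name are illustrative assumptions, not values taken from this repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5):
    """Logit-matching distillation objective: cross-entropy on the hard
    labels plus MSE between the student's logits and the frozen teacher's
    logits. `alpha` (hypothetical default) balances the two terms."""
    ce = F.cross_entropy(student_logits, labels)
    mse = F.mse_loss(student_logits, teacher_logits)
    return alpha * ce + (1.0 - alpha) * mse

# Toy usage: a batch of 4 examples with 2 classes.
student_logits = torch.randn(4, 2, requires_grad=True)   # from the small student model
teacher_logits = torch.randn(4, 2)                        # produced by BERT, no gradient needed
labels = torch.tensor([0, 1, 1, 0])
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```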
Alternatives and similar repositories for distill_BERT_into_RNN-CNN
Users interested in distill_BERT_into_RNN-CNN are comparing it to the libraries listed below
- BERT-style pre-trained language model implementation in two steps: pre-training and fine-tuning. Currently includes three models (BERT, RoBERTa, and ALBERT), all supporting Whole Word Mask mode. ☆17 · Updated 5 years ago
- Shuffling files of hundreds of gigabytes in Python (a rough external-shuffle sketch follows this list). ☆33 · Updated 3 years ago
- Code & Data for our paper "Pattern-Based Chinese Hypernym-Hyponym Relation Extraction Method" ☆12 · Updated 5 years ago
- Distilling Task-Specific Knowledge from BERT into Simple Neural Networks.
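
One listed repository shuffles files of hundreds of gigabytes in Python. Its own approach is not documented here; a common way to do this without loading the whole file into memory is a two-pass external shuffle, sketched below with hypothetical paths and bucket count.

```python
import os
import random

def external_shuffle(src_path, dst_path, num_buckets=100, tmp_dir="shuffle_tmp"):
    """Two-pass shuffle for a line-oriented file too large for RAM:
    pass 1 scatters lines randomly into bucket files, pass 2 shuffles
    each bucket in memory and appends it to the output. The bucket count
    and paths are placeholders, not values from the listed repository."""
    os.makedirs(tmp_dir, exist_ok=True)
    buckets = [open(os.path.join(tmp_dir, f"bucket_{i}.txt"), "w", encoding="utf-8")
               for i in range(num_buckets)]
    # Pass 1: assign each line to a random bucket file.
    with open(src_path, encoding="utf-8") as src:
        for line in src:
            random.choice(buckets).write(line)
    for b in buckets:
        b.close()
    # Pass 2: shuffle each bucket in memory and concatenate.
    with open(dst_path, "w", encoding="utf-8") as dst:
        for i in range(num_buckets):
            path = os.path.join(tmp_dir, f"bucket_{i}.txt")
            with open(path, encoding="utf-8") as b:
                lines = b.readlines()
            random.shuffle(lines)
            dst.writelines(lines)
            os.remove(path)
```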