taishan1994 / pytorch_knowledge_distillation
Knowledge distillation with PyTorch (Chinese text classification)
☆18 · Updated 2 years ago
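The repository above implements knowledge distillation for Chinese text classification. As context for the comparison list below, here is a minimal NumPy sketch of the classic soft-target distillation objective (Hinton et al., 2015) that such projects typically build on: a temperature-softened cross-entropy between teacher and student predictions, blended with the ordinary hard-label loss. The function names, `T=4.0`, and `alpha=0.7` are illustrative assumptions, not values taken from this repository.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Soft-target distillation: alpha-weighted cross-entropy between the
    teacher's and student's temperature-T distributions, plus (1 - alpha)
    cross-entropy against the hard labels. T and alpha are assumed defaults."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T) + 1e-12)
    # Soft-target term, scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures (as in the original paper).
    kd = -(p_teacher * log_p_student).sum(axis=-1).mean() * (T ** 2)
    # Hard-label cross-entropy at temperature 1.
    p_hard = softmax(student_logits, 1.0)
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * kd + (1 - alpha) * ce
```

A student whose logits match the teacher and the gold label should score a lower loss than one that disagrees, which is the signal the student is trained to minimize.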
Alternatives and similar repositories for pytorch_knowledge_distillation:
Users interested in pytorch_knowledge_distillation are comparing it to the repositories listed below.
- Chinese named entity recognition based on BERT-MRC ☆44 · Updated 2 years ago
- Chinese named entity recognition with GlobalPointer in PyTorch ☆37 · Updated last year
- Reproduction of the paper "Simplify the Usage of Lexicon in Chinese NER" ☆42 · Updated 3 years ago
- GPLinker_pytorch ☆81 · Updated 2 years ago
- Efficient GlobalPointer in PyTorch ☆53 · Updated 3 years ago
- Nested named entity recognition (Nested NER) ☆20 · Updated 3 years ago
- Chinese named entity recognition with TPLinker_plus in PyTorch ☆18 · Updated last year
- PyTorch baseline for the relation extraction subtask of the multi-format information extraction track, 2021 Baidu Language and Intelligence Challenge ☆52 · Updated 4 years ago
- Chinese named entity recognition with BiLSTM-CRF in PyTorch ☆14 · Updated 2 years ago
- Explore different Chinese NLP tasks using t5/mt5/t5-pegasus, such as text classification, text summarization, and so on ☆30 · Updated 2 years ago
- BERT_MRC, a SOTA model for NER tasks ☆61 · Updated last year
- Baseline implementations for several NLP tasks, including text classification, named entity recognition, entity relation extraction, NL2SQL, CKBQA, and various BERT downstream-task applications ☆47 · Updated 4 years ago
- [Unofficial] Prediction code for the AAAI 2022 paper "Unified Named Entity Recognition as Word-Word Relation Classification" ☆53 · Updated 2 years ago
- Simplified version of bert-flat with extensive comments ☆15 · Updated 3 years ago
- Nested named entity recognition based on BERT and the MRC framework ☆19 · Updated 3 years ago
- ☆21 · Updated 3 years ago
- NER with BERT-BiLSTM+CRF (pytorch_lightning version) ☆44 · Updated 2 years ago
- Chinese named entity recognition with Baidu UIE in PyTorch ☆57 · Updated 2 years ago
- Summary and comparison of Chinese classification models ☆34 · Updated 2 years ago
- Continued pre-training of Chinese BERT ☆30 · Updated 3 years ago
- Multi-model Chinese cnews news text classification ☆55 · Updated 5 years ago
- Chinese named entity recognition with HMM, BiLSTM-CRF, and ALBERT models ☆23 · Updated 2 years ago
- English multi-label text classification with BERT in PyTorch ☆33 · Updated 3 years ago
- Document-level event extraction and event causality extraction for the financial domain; 6th-place solution and code ☆62 · Updated 3 years ago
- Named entity recognition model based on BERT-MRC (machine reading comprehension) ☆20 · Updated 3 years ago
- Person relation classification with the R-BERT model, with significantly improved results ☆24 · Updated 2 years ago
- Optimized version of GlobalPointer for NER ☆120 · Updated 3 years ago
- Personal NLP experiments covering named entity recognition, entity relation extraction, and event extraction; to be updated continuously ☆34 · Updated last year
- NLP experiments: new-word mining + continued pre-training of a pretrained model ☆47 · Updated last year
- Multi-label text classification ☆30 · Updated 3 years ago