natureLanguageQing / medical_ner
Construction of a dataset for medical named entity recognition (NER)
☆18 · Updated 3 years ago
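Since the repository is about building an NER dataset, a minimal sketch of the usual target format may help: converting character-span annotations into token-level BIO tags, the input format most of the models listed below consume. The sample sentence, span offsets, and the "disease" label are hypothetical illustrations, not taken from this repository.

```python
# Minimal sketch: turn character-span annotations into BIO tags for Chinese NER.
# The sample sentence and the "disease" label are hypothetical placeholders.

def spans_to_bio(text, spans):
    """spans: list of (start, end, label) with character offsets, end exclusive."""
    tags = ["O"] * len(text)
    for start, end, label in spans:
        tags[start] = f"B-{label}"               # first character of the entity
        for i in range(start + 1, end):
            tags[i] = f"I-{label}"               # remaining characters
    return tags

text = "患者确诊为高血压"             # "The patient was diagnosed with hypertension"
spans = [(5, 8, "disease")]           # 高血压 occupies characters 5..7
for ch, tag in zip(text, spans_to_bio(text, spans)):
    print(ch, tag)
```

The printed output pairs each character with its tag, e.g. 高 B-disease, 血 I-disease, 压 I-disease, with O everywhere else.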
Alternatives and similar repositories for medical_ner:
Users interested in medical_ner are comparing it to the libraries listed below
- Search over all Chinese NLP datasets, with commonly used English NLP datasets included ☆14 · Updated 4 years ago
- BERT_MRC, a SOTA model for the NER task ☆60 · Updated 10 months ago
- GPLinker_pytorch ☆80 · Updated 2 years ago
- NLP relation extraction: sequence labeling, cascaded pointer networks, Multi-head Selection, and Deep Biaffine Attention ☆100 · Updated 3 years ago
- Medical named entity recognition with the extractive UIE open-sourced by PaddleNLP (torch implementation) ☆42 · Updated 2 years ago
- Chinese named entity recognition based on bert_mrc ☆43 · Updated 2 years ago
- CCKS2020 entity linking task for short Chinese texts. Main approach: match Query and Doc texts with a BiLSTM- and Attention-based semantic model, then rank the match scores pairwise to pick the best knowledge-base entity. ☆47 · Updated 3 years ago
- ☆18 · Updated 2 years ago
- [Unofficial] Prediction code for the AAAI 2022 paper: Unified Named Entity Recognition as Word-Word Relation Classification ☆52 · Updated 2 years ago
- General-purpose KBQA; training data drawn from ccks2018 and ccks2019, with knowledge-graph data crawled from Baidu Baike ☆24 · Updated 4 years ago
- This NER project covers several Chinese datasets; models include BiLSTM+CRF, BERT+Softmax, BERT+Cascade, BERT+WOL, and more, with TF Serving handling model deployment for both online and offline inference ☆80 · Updated 3 years ago
- ccks2020 NER competitions ☆115 · Updated 4 years ago
- Chinese entity linking based on BERT ☆30 · Updated 3 years ago
- Chinese medical named entity recognition ☆37 · Updated 4 years ago
- Named entity recognition for electronic medical records ☆35 · Updated 6 years ago
- Code for the ACL 2020 paper "FLAT: Chinese NER Using Flat-Lattice Transformer"; I commented, modified, and added to parts of the source so the code is easier to reproduce ☆56 · Updated 4 years ago
- lic2020 relation extraction competition; a PyTorch implementation of Su Jianlin's model ☆102 · Updated 4 years ago
- Chinese word segmentation with Bert_CRF ☆19 · Updated 5 years ago
- An annotated version of TPLinker, adapted to Chinese datasets ☆58 · Updated 3 years ago
- https://tianchi.aliyun.com/dataset/dataDetail?dataId=95414 ☆30 · Updated 3 years ago
- CMeEE/CBLUE/NER entity recognition ☆125 · Updated 2 years ago
- TPlinker for NER: Chinese/English named entity recognition ☆122 · Updated 3 years ago
- Torch baseline for the relation extraction part of the multi-form information extraction track of Baidu's 2021 Language and Intelligence Technology Competition ☆51 · Updated 3 years ago
- Reproduction of the paper "Simplify the Usage of Lexicon in Chinese NER" ☆42 · Updated 3 years ago
- Baseline model implementations for several NLP tasks, covering text classification, named entity recognition, entity relation extraction, NL2SQL, CKBQA, and various downstream applications of BERT ☆47 · Updated 3 years ago
- Chinese named entity recognition with TPLinker_plus, based on PyTorch ☆18 · Updated last year
- An implementation of multi-head-selection joint entity and relation extraction ☆30 · Updated 5 years ago
- RoBERTa + BiLSTM + CRF for the Chinese NER task (a minimal sketch of this architecture follows the list) ☆28 · Updated 3 years ago
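BiLSTM+CRF recurs throughout the list above, both on its own and under a pretrained encoder as in the final entry. Here is a minimal sketch of that architecture, assuming the third-party pytorch-crf package for the CRF layer; the vocabulary size, dimensions, and tag set are placeholder assumptions, not taken from any of the listed repositories.

```python
# Minimal BiLSTM+CRF tagger sketch in PyTorch. Assumes `pip install pytorch-crf`.
import torch
import torch.nn as nn
from torchcrf import CRF

class BiLSTMCRF(nn.Module):
    def __init__(self, vocab_size, num_tags, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2, batch_first=True,
                            bidirectional=True)
        self.fc = nn.Linear(hidden_dim, num_tags)   # per-token emission scores
        self.crf = CRF(num_tags, batch_first=True)  # transition scores + Viterbi

    def forward(self, tokens, tags, mask):
        emissions = self.fc(self.lstm(self.embed(tokens))[0])
        return -self.crf(emissions, tags, mask=mask)  # negative log-likelihood

    def decode(self, tokens, mask):
        emissions = self.fc(self.lstm(self.embed(tokens))[0])
        return self.crf.decode(emissions, mask=mask)  # best tag path per sample

# Placeholder tag set, e.g. O, B-dis, I-dis, B-sym, I-sym.
model = BiLSTMCRF(vocab_size=5000, num_tags=5)
tokens = torch.randint(1, 5000, (2, 16))            # batch of 2, length 16
tags = torch.randint(0, 5, (2, 16))
mask = torch.ones(2, 16, dtype=torch.bool)
loss = model(tokens, tags, mask)
pred = model.decode(tokens, mask)
```

Broadly, the RoBERTa + BiLSTM + CRF variants in the list replace the embedding lookup with a pretrained encoder's hidden states; the emission and CRF layers stay the same.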