ttttong / BERT-BILSTM-GCN-CRF-for-NER
NER model that fuses a GCN and part-of-speech tags into the base BERT-BiLSTM-CRF architecture
☆26 · Updated 3 years ago
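The fusion described above can be sketched as a single graph-convolution step over token features: contextual hidden states (e.g. BERT-BiLSTM outputs) are concatenated with part-of-speech embeddings and propagated over a dependency-graph adjacency matrix before the CRF emission layer. This is a hypothetical minimal NumPy sketch of the general idea, not the repository's actual code; all names and shapes are illustrative.

```python
import numpy as np

def gcn_fuse(hidden, pos_emb, adj, weight):
    """One GCN layer fusing token hidden states with POS embeddings (illustrative).

    hidden:  (n_tokens, d_h)      contextual encoder outputs, e.g. BERT-BiLSTM
    pos_emb: (n_tokens, d_p)      part-of-speech tag embeddings
    adj:     (n_tokens, n_tokens) 0/1 dependency-graph adjacency
    weight:  (d_h + d_p, d_out)   learnable projection
    """
    # Concatenate contextual and POS features per token.
    feats = np.concatenate([hidden, pos_emb], axis=-1)
    # Symmetrically normalize adjacency with self-loops: D^-1/2 (A+I) D^-1/2.
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
    # Propagate over the graph, project, apply ReLU; in a full model this
    # output would feed the CRF emission scores.
    return np.maximum(a_norm @ feats @ weight, 0.0)

# Toy example: 3 tokens, 4-dim hidden states, 2-dim POS embeddings.
rng = np.random.default_rng(0)
out = gcn_fuse(rng.normal(size=(3, 4)), rng.normal(size=(3, 2)),
               np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]]),
               rng.normal(size=(6, 5)))
print(out.shape)  # (3, 5)
```

In the full model a trained CRF would decode tag sequences from these fused features; the sketch only shows how graph structure and POS information enter the token representation.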
Related projects
Alternatives and complementary repositories for BERT-BILSTM-GCN-CRF-for-NER
- BERT_MRC, a SOTA model for NER tasks ☆59 · Updated 8 months ago
- Joint extraction of entities and relations ☆20 · Updated 2 years ago
- Chinese event extraction on Baidu competition data and English event extraction on ACE 2005 data; adapts Su Jianlin's DGCNN triple-extraction algorithm to event extraction, ports it from Keras to PyTorch, and designs the data loading to extend to multiple languages ☆48 · Updated 3 years ago
- A simple Chinese event extraction model: joint tagging of triggers and entities, with simultaneous entity-role classification ☆70 · Updated 3 years ago
- Nested named entity recognition (Nested NER) ☆19 · Updated 3 years ago
- Reproduction of the paper "Simplify the Usage of Lexicon in Chinese NER" ☆40 · Updated 3 years ago
- PRGC: Potential Relation and Global Correspondence Based Joint Relational Triple Extraction ☆111 · Updated 3 years ago
- Entity extraction and relation extraction using BERT ☆11 · Updated 4 years ago
- Source code for the ACL 2021 Findings paper: CasEE: A Joint Learning Framework with Cascade Decoding for Overlapping Event Extraction ☆80 · Updated 2 years ago
- Effective Cascade Dual-Decoder Model for Joint Entity and Relation Extraction ☆17 · Updated 2 years ago
- NLP relation extraction: sequence labeling, cascaded pointer networks, multi-head selection, and deep biaffine attention ☆99 · Updated 3 years ago
- PyTorch baseline for the relation-extraction part of the multi-format information extraction track of the 2021 Baidu Language and Intelligence Technology Competition ☆50 · Updated 3 years ago
- TPLinker for NER: Chinese/English named entity recognition ☆122 · Updated 3 years ago
- Chinese event extraction ☆11 · Updated 3 years ago
- Code for Label Semantics for Few Shot Named Entity Recognition ☆54 · Updated last year
- Code for the ACL 2021 paper: MECT: Multi-Metadata Embedding based Cross-Transformer for Chinese Named Entity Recognition ☆67 · Updated 3 years ago
- Chinese NER using BERT + BiLSTM/DGCNN + attention + CRF ☆33 · Updated 2 years ago
- Chinese named entity recognition based on bert_mrc ☆43 · Updated 2 years ago
- Document-level event extraction and event causality extraction for the financial domain; 6th-place solution and code ☆59 · Updated 2 years ago
- NER with BERT-BiLSTM + CRF (pytorch_lightning version) ☆40 · Updated last year
- Chinese relation extraction based on PyTorch + BERT ☆28 · Updated 2 years ago
- Chinese event extraction based on PyTorch + BERT ☆65 · Updated 2 years ago