Houlong66 / lattice_lstm_with_pytorch
☆24 · Updated 6 years ago
Alternatives and similar repositories for lattice_lstm_with_pytorch
Users interested in lattice_lstm_with_pytorch are comparing it to the repositories listed below.
- A Neural Multi-digraph Model for Chinese NER with Gazetteers ☆86 · Updated 10 months ago
- Repository for the NLPCC 2020 AutoIE task ☆51 · Updated 5 years ago
- ☆29 · Updated 6 years ago
- Code for the NLPCC 2018 paper "Distant Supervision for Relation Extraction with Neural Instance Selector" ☆12 · Updated 6 years ago
- CCKS 2019 Task 2: Entity Recognition and Linking ☆94 · Updated 5 years ago
- 2019 Language and Intelligence Technology Competition, Knowledge-driven Dialogue track: source code and models of the 5th-place entry on leaderboard B ☆25 · Updated 5 years ago
- ☆23 · Updated 6 years ago
- Chinese Named Entity Recognition Using Neural Network ☆29 · Updated 2 years ago
- Baseline for the CCKS 2019 IPRE task ☆48 · Updated 5 years ago
- Hierarchical Neural Relation Extraction ☆96 · Updated 4 years ago
- 2019 Language and Intelligence Technology Competition, Knowledge-driven Dialogue track: source code and models of the 5th-place entry on leaderboard B ☆27 · Updated 5 years ago
- ☆23 · Updated 6 years ago
- End-to-end neural coreference resolution (forked from https://github.com/kentonl/e2e-coref with minor changes) ☆20 · Updated 6 years ago
- Chinese Open Entity-Relation Knowledge Base ☆36 · Updated 7 years ago
- Code for "A Unified MRC Framework for Named Entity Recognition" ☆33 · Updated 5 years ago
- ☆112 · Updated 7 years ago
- For new students who have just joined an NLP group ☆27 · Updated 7 years ago
- Code for the NAACL 2019 paper "An Encoding Strategy Based Word-Character LSTM for Chinese NER" ☆65 · Updated 5 years ago
- ☆31 · Updated 6 years ago
- Notes and code about NLP ☆24 · Updated 6 years ago
- ☆23 · Updated 6 years ago
- Code for Assignment 2 of cs224n ☆18 · Updated 6 years ago
- init ☆21 · Updated 6 years ago
- ☆59 · Updated 5 years ago
- Source code for the paper "Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction" (IJCAI 2020) ☆28 · Updated 5 years ago
- Revised version of the SAT model from "Improved Word Representation Learning with Sememes" ☆49 · Updated 4 years ago
- ☆21 · Updated 5 years ago
- ☆25 · Updated 5 years ago
- RoBERTa-wwm base distilled model, distilled from RoBERTa-wwm using RoBERTa-wwm-large ☆65 · Updated 5 years ago
- Source code and data for our long paper (Wu et al., 2019) ☆53 · Updated 5 years ago