nocater / text_gcn
Original project: [GCN_AAAI2019](https://github.com/yao8839836/text_gcn/); this fork adds multi-label classification support.
☆18 · Updated 5 years ago
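The original Text GCN predicts a single label per document via a softmax over classes. A common way to add multi-label support (this is a generic sketch of the technique, not this fork's actual code) is to replace the softmax with an independent sigmoid per class and threshold the resulting probabilities:

```python
import math

def softmax(logits):
    """Single-label head: probabilities sum to 1, predict the argmax class."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def multi_label_predict(logits, threshold=0.5):
    """Multi-label head: an independent sigmoid per class;
    every class whose probability clears the threshold is predicted."""
    probs = [1.0 / (1.0 + math.exp(-x)) for x in logits]
    return [i for i, p in enumerate(probs) if p >= threshold]

# Example: class 0 and class 2 are both predicted, which a softmax
# head could never do (it would return only the argmax, class 0).
print(multi_label_predict([2.0, -1.0, 0.5]))  # [0, 2]
```

On the training side, this change typically pairs with swapping categorical cross-entropy for a per-class binary cross-entropy loss, since each class is now an independent yes/no decision.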
Alternatives and similar repositories for text_gcn:
Users interested in text_gcn are also comparing the repositories listed below.
- Hierarchical Attention Network (HAN) with multi-task learning for the AI Challenger fine-grained user-review sentiment analysis task. https://challenger.ai/competition/fsauor2018 ☆58 · Updated 5 years ago
- An implementation of named-entity disambiguation ☆41 · Updated 6 years ago
- A collection of ACL 2019 papers ☆31 · Updated 5 years ago
- Similar case matching ☆46 · Updated 5 years ago
- Baseline for the CCKS 2020 few-shot cross-class transfer event extraction task in the financial domain ☆55 · Updated 2 years ago
- Entity linking demo ☆65 · Updated 6 years ago
- Paper reading ☆20 · Updated 6 years ago
- Source code of the paper "Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction, IJCAI 2020" ☆28 · Updated 4 years ago
- Personal notes on hands-on relation extraction and on using open-source toolkits ☆56 · Updated 6 years ago
- BiLSTM + Attention + CRF ☆37 · Updated 6 years ago
- Personal BERT baseline for the information extraction track of the 2019 Language and Intelligence Challenge ☆18 · Updated 5 years ago
- Text classification in Tensorflow 1.x ☆24 · Updated 2 years ago
- 2019 Sohu campus algorithm competition ☆27 · Updated 5 years ago
- Global Normalization of Convolutional Neural Networks for Joint Entity and Relation Classification ☆18 · Updated 5 years ago
- ☆28 · Updated last year
- An implementation of multi-head-selection joint entity and relation extraction ☆30 · Updated 5 years ago
- Baseline for CCKS2019-IPRE ☆48 · Updated 5 years ago
- FastGCN for inductive text classification ☆85 · Updated 5 years ago
- 5th-place code for the information extraction track of the 2019 Baidu Language and Intelligence Challenge ☆69 · Updated 5 years ago
- Chinese relation extraction ☆72 · Updated 6 years ago
- CCKS 2019 event subject extraction for the financial domain ☆47 · Updated 5 years ago
- PyTorch BERT text classification ☆31 · Updated 6 years ago
- Attention-based text classification with a visualization interface ☆9 · Updated 6 years ago
- ☆31 · Updated 6 years ago
- CCKS 2019 person relation extraction ☆74 · Updated 5 years ago
- Fine-tuning Google's pre-trained BERT model for Chinese multi-class classification ☆40 · Updated 6 years ago
- Named entity recognition with a TextCNN-BiLSTM-CRF model in PyTorch ☆41 · Updated 6 years ago
- Capsule networks, recurrent networks (LSTM/GRU), and CNNs for Chinese text classification, implemented in PyTorch ☆43 · Updated 6 years ago
- ☆31 · Updated 6 years ago
- NLP Keras BiLSTM + CRF ☆52 · Updated 6 years ago