chenywang / hierarchical_attention_model
Attention-based text classification, with a visualization interface
☆9 · Updated 5 years ago
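For orientation, below is a minimal, generic sketch of the hierarchical-attention classifier the project name refers to (word-level attention inside each sentence, then sentence-level attention over the document). This is not the repository's own code: it assumes TensorFlow/Keras, and `AttentionPool` and all size constants are illustrative placeholders.

```python
# A generic sketch of a hierarchical attention network for text classification,
# NOT the repository's own code. Assumes TensorFlow/Keras; AttentionPool and all
# size constants below are illustrative placeholders.
import tensorflow as tf
from tensorflow.keras import layers, Model

MAX_SENTS, MAX_WORDS = 15, 30          # sentences per document, words per sentence
VOCAB, EMB, HID, CLASSES = 20000, 100, 64, 5

class AttentionPool(layers.Layer):
    """Additive attention: score each timestep, softmax over time, weighted sum."""
    def build(self, input_shape):
        d = int(input_shape[-1])
        self.W = self.add_weight(name="W", shape=(d, d), initializer="glorot_uniform")
        self.b = self.add_weight(name="b", shape=(d,), initializer="zeros")
        self.u = self.add_weight(name="u", shape=(d, 1), initializer="glorot_uniform")

    def call(self, seq):
        scores = tf.tanh(tf.matmul(seq, self.W) + self.b)          # (batch, T, d)
        alpha = tf.nn.softmax(tf.matmul(scores, self.u), axis=1)   # (batch, T, 1)
        return tf.reduce_sum(alpha * seq, axis=1)                  # (batch, d)

# Word-level encoder: one sentence of token ids -> a sentence vector.
word_in = layers.Input(shape=(MAX_WORDS,), dtype="int32")
w = layers.Embedding(VOCAB, EMB)(word_in)
w = layers.Bidirectional(layers.GRU(HID, return_sequences=True))(w)
sentence_encoder = Model(word_in, AttentionPool()(w))

# Sentence-level encoder: a document is a (MAX_SENTS, MAX_WORDS) matrix of ids.
doc_in = layers.Input(shape=(MAX_SENTS, MAX_WORDS), dtype="int32")
s = layers.TimeDistributed(sentence_encoder)(doc_in)
s = layers.Bidirectional(layers.GRU(HID, return_sequences=True))(s)
out = layers.Dense(CLASSES, activation="softmax")(AttentionPool()(s))

model = Model(doc_in, out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

The per-timestep weights (`alpha`) are the quantities an attention visualization interface would typically highlight over words and sentences.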
Related projects
Alternatives and complementary repositories for hierarchical_attention_model
- Ant Financial competition, ranked 15th/2632 ☆47 · Updated 6 years ago
- Text classification and representation-learning network based on BiLSTM and self-attention ☆29 · Updated 5 years ago
- Baseline for ccks2019-ipre ☆49 · Updated 5 years ago
- Notes and code about NLP ☆24 · Updated 5 years ago
- Joint Extraction of Entity Mentions and Relations without Dependency Trees ☆21 · Updated 6 years ago
- Chinese sentence similarity computation based on MP-CNN ☆13 · Updated 6 years ago
- Personal baseline project for the AI Challenger 2018 fine-grained user review sentiment analysis competition ☆14 · Updated 6 years ago
- 6th-place code for the 2019 Daguan Cup (达观杯) ☆44 · Updated last year
- BiLSTM + Attention + CRF ☆37 · Updated 5 years ago
- April 8, 2019: the 3rd Sohu Campus Content Recognition Algorithm Competition ☆25 · Updated 5 years ago
- Relation Extraction: Chinese relation extraction ☆72 · Updated 6 years ago
- 5th-place code for the information extraction track of the 2019 Baidu Language and Intelligence Technology Competition ☆69 · Updated 5 years ago
- Original project: [GCN_AAAI2019](https://github.com/yao8839836/text_gcn/); adds support for multi-label classification ☆18 · Updated 4 years ago
- ☆32 · Updated 5 years ago
- Top-3 solution in the e-commerce review opinion mining track of the 2019 Zhijiang Cup AI Competition ☆45 · Updated 5 years ago
- NER for Chinese electronic medical records, using doc2vec, self_attention, and multi_attention ☆26 · Updated 6 years ago
- 1st-place solution for the CCKS 2018 open-domain Chinese question answering task ☆111 · Updated 5 years ago
- ☆12 · Updated 3 years ago
- An implementation of named entity disambiguation ☆41 · Updated 5 years ago
- CCKS 2019 event subject extraction for the financial domain ☆48 · Updated 5 years ago
- Keyword extraction algorithm for CSDN blog posts, combining TF, IDF, part-of-speech, position, and other features. Used in the 2017 SMP user profiling evaluation, ranking 4th, with 59.9% precision on the validation set and 58.7% on the final set. A heuristic method with good generality. ☆30 · Updated 6 years ago
- Chinese text classification with BERT ☆40 · Updated 5 years ago
- Code for the Baidu machine reading comprehension competition ☆10 · Updated 5 years ago
- Similar case matching ☆46 · Updated 5 years ago
- 2019 Sohu Campus Algorithm Competition ☆27 · Updated 5 years ago
- ☆17 · Updated 5 years ago
- A Keras implementation of an attention-based Siamese Manhattan LSTM ☆54 · Updated 6 years ago
- A named-entity relation extraction program based on syntactic parsing ☆65 · Updated 8 years ago
- Entity linking demo ☆64 · Updated 5 years ago
- A TensorFlow implementation of the paper _End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures_ (https://www.aclwe…) ☆29 · Updated 6 years ago