sys1874 / seq2seq-model-for-Sohu-2019
Fully end-to-end core entity recognition and sentiment prediction
☆35 · Updated 6 years ago
Alternatives and similar repositories for seq2seq-model-for-Sohu-2019
Users that are interested in seq2seq-model-for-Sohu-2019 are comparing it to the libraries listed below
- Event subject extraction for the financial domain (CCKS 2019), a baseline ☆119 · Updated 6 years ago
- 2019 Language and Intelligence Technology Competition, knowledge-driven dialogue track: 5th place on leaderboard B, source code and models ☆25 · Updated 5 years ago
- Adversarial Attack text matching competition ☆42 · Updated 5 years ago
- Chinese named entity recognition based on BERT (PyTorch) ☆18 · Updated 6 years ago
- CCL 2018 customer-service user intent classification, 1st place solution ☆150 · Updated 2 years ago
- Chinese pre-trained ELECTRA model: pretrained Chinese model based on adversarial learning ☆141 · Updated 5 years ago
- Summary of the 2018 Machine Reading Comprehension technology competition: ranked 6th by BLEU-4 and 14th by ROUGE-L among more than 1,000 teams worldwide (no ensemble, no pretrained word embeddings, no dropout) ☆30 · Updated 7 years ago
- Baseline for ccks2019-ipre ☆48 · Updated 5 years ago
- 2019 DataGrand Cup, 6th place code ☆44 · Updated 2 years ago
- Capsule-based opinion-type reading comprehension model ☆89 · Updated 6 years ago
- CCKS 2019 Task 2: Entity Recognition and Linking ☆94 · Updated 6 years ago
- 2019 Sohu Campus Algorithm Competition: final-round solution slides and single-model LightGBM entity code ☆71 · Updated 6 years ago
- DataGrand algorithm competition NER task, from re-training BERT to fine-tuning and prediction ☆75 · Updated 2 years ago
- Chinese translation of the paper "XLNet: Generalized Autoregressive Pretraining for Language Understanding" ☆49 · Updated 5 years ago
- CHIP 2018 question matching competition, rank 6 solution ☆21 · Updated 6 years ago
- Code for Fine-grained Sentiment Analysis of User Reviews, AI Challenger 2018 ☆171 · Updated 5 years ago
- A long-text classifier implemented in PyTorch ☆46 · Updated 6 years ago
- DataGrand 2019 information extraction competition, rank 9 ☆130 · Updated 5 years ago
- ☆44 · Updated 6 years ago
- Fine-tuning Google's pre-trained BERT model for Chinese multiclass classification ☆40 · Updated 6 years ago
- A RoBERTa-wwm-base model distilled from RoBERTa-wwm-large ☆65 · Updated 5 years ago
- LES Cup: 2nd National "Military Intelligence Machine Reading" Challenge, rank 7 solution ☆40 · Updated 5 years ago
- 2020 Language and Intelligence Technology Competition: relation extraction task (https://aistudio.baidu.com/aistudio/competition/detail/31?lang=zh_CN) ☆24 · Updated 5 years ago
- CCF BDCI negative financial information and entity determination, 3rd place solution ☆55 · Updated 5 years ago
- Baidu's 2019 entity linking competition (CCKS 2019), a baseline ☆112 · Updated 5 years ago
- New Kaggle competition (baseline): BERT-based fine-tuning approach plus a tensor2tensor-based Transformer Encoder approach ☆61 · Updated 6 years ago
- ☆61 · Updated 5 years ago
- Using BERT for the LIC 2019 machine reading comprehension task ☆90 · Updated 6 years ago
- Learning character embeddings from Chinese pre-trained models, and testing the Chinese performance of BERT and ELMo ☆100 · Updated 5 years ago
- Rank 2 solution (no BERT) for the 2019 Language and Intelligence Challenge, DuReader 2.0 Machine Reading Comprehension ☆128 · Updated 5 years ago