wuyaqiang / NLPCC_EmotionDetection
(Competition) NLPCC shared task: emotion detection
☆9 · Updated 5 years ago
Related projects
Alternatives and complementary repositories for NLPCC_EmotionDetection
- Dataset from 'Character-based BiLSTM-CRF Incorporating POS and Dictionaries for Chinese Opinion Target Extraction' ☆41 · Updated 6 years ago
- Pytorch implementation of "Character-based BiLSTM-CRF Incorporating POS and Dictionaries for Chinese Opinion Target Extraction", ACML2018 ☆56 · Updated 3 months ago
- 2019 Chinese human-machine dialogue natural language understanding competition, hosted by the Social Media Processing Committee of the Chinese Information Processing Society of China ☆74 · Updated 4 years ago
- Text data augmentation ☆16 · Updated 4 years ago
- Chinese text classification with BERT ☆40 · Updated 5 years ago
- Keras implementation of distilling BERT (12 layers) into CNN, BiLSTM, and BERT (3 layers) models ☆27 · Updated 4 years ago
- Chinese e-commerce slot-filling datasets for computers, phones, and cameras ☆12 · Updated 4 years ago
- Third-place solution for the 2020 BAAI-JD multimodal dialogue competition (JDDC2020) ☆41 · Updated 4 years ago
- PyTorch version of the Sequential Matching Network ☆21 · Updated 5 years ago
- A model addressing the lack of diversity in generation tasks (translation, paraphrasing, etc.) ☆45 · Updated 5 years ago
- Opinion mining and sentiment analysis of e-commerce reviews with BERT, following an NER-style approach ☆41 · Updated 5 years ago
- 2019 Language and Intelligence Challenge, knowledge-driven dialogue: source code and model for 5th place on leaderboard B ☆27 · Updated 5 years ago
- 5th place solution for the Duplicate Question Detection competition, based on adversarial attack ☆39 · Updated 4 years ago
- Machine reading comprehension for LIC2019 using BERT ☆89 · Updated 5 years ago
- Text classification and representation learning network based on BiLSTM and self-attention ☆29 · Updated 5 years ago
- Chinese Named Entity Recognition Using Neural Network ☆29 · Updated 2 years ago
- Tianchi COVID-19 public-welfare text similarity competition ☆20 · Updated 4 years ago
- CGED & CSC (Chinese Grammatical Error Diagnosis & Chinese Spelling Check) ☆22 · Updated 4 years ago
- Implementation of multi-head-selection joint entity and relation extraction ☆31 · Updated 5 years ago
- 2020 Language and Intelligence Challenge: recommendation-oriented dialogue task ☆51 · Updated 3 years ago
- Keyword extraction project ☆24 · Updated 4 years ago
- We take bragging seriously ☆44 · Updated last year
- Paper notes: Linguistically Regularized LSTM for Sentiment Classification ☆7 · Updated 6 years ago
- Multi-label text classification implemented in PyTorch ☆16 · Updated 5 years ago
- Named entity recognition in PyTorch: LSTM and LSTM-CRF ☆26 · Updated 5 years ago
- LSTM sentiment analysis of reviews with two attention layers ☆22 · Updated 6 years ago
- Entry model for the CAIL2019 reading comprehension task ☆42 · Updated 5 years ago
- A seq2seq model with a BERT encoder and a Transformer decoder, applicable to text generation tasks in Chinese NLP ☆72 · Updated 5 years ago
- 2020 Language and Intelligence Challenge: relation extraction task (https://aistudio.baidu.com/aistudio/competition/detail/31?lang=zh_CN) ☆25 · Updated 4 years ago
- A distilled RoBERTa-wwm base model, distilled from RoBERTa-wwm-large ☆65 · Updated 4 years ago