ZephyrChenzf / text-classification-pytorch
Text classification with PyTorch on a JD.com product-review dataset, comparing several models.
☆25 · Updated 7 years ago
Alternatives and similar repositories for text-classification-pytorch
Users interested in text-classification-pytorch are comparing it to the repositories listed below.
- Event subject extraction for the financial domain (CCKS 2019), a baseline ☆119 · Updated 6 years ago
- A collection of text-classification methods, mainly TextCNN, TextRNN, LEAM, Transformer, Attention, fastText, HAN, etc. ☆76 · Updated 6 years ago
- Opinion mining and sentiment analysis of e-commerce reviews based on BERT, modeled on NER ☆42 · Updated 6 years ago
- Champion solution of an automotive-topic sentiment analysis competition ☆27 · Updated 6 years ago
- Hierarchical Attention Network (HAN) + multi-task learning for the AI Challenger fine-grained user-review sentiment analysis task. https://challenger.ai/competition/fsauor2018 ☆58 · Updated 6 years ago
- Ant Financial competition, ranked 15th/2632 ☆48 · Updated 6 years ago
- NER task of the DataGrand algorithm competition, from re-training BERT through fine-tuning and prediction ☆75 · Updated 2 years ago
- Keyword extraction for the 2018 Sensors Data Cup university algorithm competition, solo rank 3/591 ☆65 · Updated 6 years ago
- ☆32 · Updated 6 years ago
- Baseline for the Sohu campus algorithm competition ☆66 · Updated 6 years ago
- AI Challenger baseline for fine-grained user-review sentiment analysis ☆230 · Updated 6 years ago
- 5th-place code for the information extraction task of the 2019 Baidu Language and Intelligence Technology Competition ☆69 · Updated 6 years ago
- The 3rd Sohu campus content recognition algorithm competition, April 8, 2019 ☆25 · Updated 6 years ago
- Generating character embeddings from Chinese pre-trained models; testing BERT and ELMo on Chinese ☆100 · Updated 5 years ago
- Reimplementations of NLP papers, mainly from top venues such as ACL, AAAI, and EMNLP ☆87 · Updated 3 years ago
- Adversarial-attack text-matching competition ☆42 · Updated 5 years ago
- BiLSTM+CRF in PyTorch and a classic CRF with pysuite; information extraction based on bidirectional RNNs and CRF feature templates ☆17 · Updated 6 years ago
- Entity relation extraction for the financial domain ☆51 · Updated 6 years ago
- Keras implementation of BERT-BiLSTM-CRF ☆41 · Updated 6 years ago
- Bert-classification and bert-dssm implementation with keras ☆94 · Updated 5 years ago
- PyTorch implementation of BiLSTM-CRF for Chinese NER ☆61 · Updated 6 years ago
- seq2seq + attention model for Chinese text summarization ☆42 · Updated 7 years ago
- Named entity recognition in PyTorch with a TextCNN-BiLSTM-CRF model ☆42 · Updated 7 years ago
- Fine-grained user-review sentiment analysis ☆123 · Updated 6 years ago
- 1st-place solution for CCL2018 user intent classification in the customer-service domain ☆150 · Updated 2 years ago
- Character-embedding extraction tool for pre-trained BERT models ☆53 · Updated 5 years ago
- Chinese sentence similarity computation based on a Siamese LSTM ☆130 · Updated 7 years ago
- 6th-place code for the 2019 DataGrand Cup ☆44 · Updated 2 years ago
- 2019 Sohu campus algorithm competition: finals solution slides and single-LightGBM-model code for the entity task ☆71 · Updated 6 years ago
- Keyword extraction algorithm for CSDN blog posts, combining TF, IDF, part of speech, position, and other features. Entered in the 2017 SMP user-profiling evaluation, ranked 4th, with 59.9% precision on the validation set and 58.7% on the final set. A heuristic method with broad applicability. ☆30 · Updated 7 years ago