linzzzzzz / Chinese-Text-Classification-Pytorch-Tuning
Chinese text classification with BERT, TextCNN, TextRNN, FastText, TextRCNN, BiLSTM_Attention, DPCNN, and Transformer, based on PyTorch, ready to use out of the box.
☆30 · Updated 2 years ago
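For orientation, below is a minimal, self-contained TextCNN sketch in PyTorch illustrating the kind of model this repository bundles. The vocabulary size, filter sizes, and class count are illustrative assumptions, not values taken from the repository's code.

```python
# Illustrative only: a minimal TextCNN classifier (Kim-2014 style).
# All hyperparameters below are assumptions for the sketch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    def __init__(self, vocab_size=5000, embed_dim=128,
                 num_filters=64, filter_sizes=(2, 3, 4), num_classes=10):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One 1-D convolution per n-gram window size.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k) for k in filter_sizes
        )
        self.fc = nn.Linear(num_filters * len(filter_sizes), num_classes)

    def forward(self, token_ids):                        # (batch, seq_len)
        x = self.embedding(token_ids).transpose(1, 2)    # (batch, embed, seq)
        # Global max-pool over time for each convolution branch.
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))         # (batch, num_classes)

model = TextCNN()
dummy = torch.randint(1, 5000, (4, 50))   # 4 fake sentences of 50 token ids
print(model(dummy).shape)                 # torch.Size([4, 10])
```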
Related projects
Alternatives and complementary repositories for Chinese-Text-Classification-Pytorch-Tuning
- Text classification baselines: BERT, semi-supervised learning with UDA, adversarial training, data augmentation ☆98 · Updated 3 years ago
- Classification of very long texts (over 1,000 characters); document-level text classification; mainly addresses the long-range dependency problem ☆119 · Updated 3 years ago
- Code from articles published on Zhihu and blogs ☆45 · Updated 3 years ago
- Chinese multi-label classification based on pytorch_bert ☆82 · Updated 3 years ago
- SimCSE contrastive-learning model for Chinese semantic similarity ☆78 · Updated 2 years ago
- THUCNews Chinese text classification dataset, containing 840,000 news documents in 14 categories; classification performance of several BERT versions is tested on top of this model ☆52 · Updated 3 years ago
- Multi-model Chinese text classification on the cnews news dataset ☆52 · Updated 4 years ago
- Two NLP text augmentation methods: synonym replacement (using a word2vec vocabulary) and back-translation ☆71 · Updated 3 years ago
- Implementation of SimCSE and ESimCSE on Chinese datasets ☆189 · Updated 2 years ago
- Summary and comparison of Chinese classification models ☆34 · Updated 2 years ago
- Prompt-based Chinese text classification ☆55 · Updated last year
- smp ewect code ☆75 · Updated 4 years ago
- Implements Bert_RCNN on top of pretrained BERT and pretrained character/word embeddings to assign multiple labels to a single long text ☆17 · Updated 3 years ago
- Write-up of the "netizen emotion recognition during the epidemic" competition, with the top 1–3 solutions ☆51 · Updated 4 years ago
- Generating sentence or word vectors with a pretrained BERT model ☆28 · Updated 4 years ago
- Multi-label text classification based on PyTorch + BERT (a minimal multi-label head sketch follows this list) ☆91 · Updated last year
- Chinese-Text-Classification Project including bert-classification, textCNN and so on. ☆145 · Updated 2 years ago
- English multi-label text classification with BERT in PyTorch ☆32 · Updated 2 years ago
- ☆87 · Updated 3 years ago
- Long-text classification in PyTorch; networks used: FastText, TextCNN, TextRNN, TextRCNN, Transformer ☆45 · Updated 4 years ago
- BERT_MRC, a SOTA model for the NER task ☆60 · Updated 8 months ago
- Demo of multi-label text classification with BERT and TextCNN ☆29 · Updated 2 years ago
- Fine-tuning a BERT PyTorch model for multi-label text classification ☆125 · Updated 5 years ago
- ☆277 · Updated 2 years ago
- Chinese word segmentation implemented with traditional methods (n-gram, HMM, etc.), neural methods (CNN, LSTM, etc.), and pretrained methods (BERT, etc.) ☆32 · Updated 2 years ago
- Sentence Classification Model Implemented with PyTorch ☆16 · Updated 2 years ago
- Binary classification with BERT + BiLSTM in PyTorch ☆37 · Updated 3 years ago
- Sentiment Analysis-Pytorch (PyTorch implementation of sentiment analysis) ☆58 · Updated 4 years ago
- CMeEE/CBLUE NER entity recognition ☆124 · Updated 2 years ago
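The multi-label projects listed above generally share the same basic recipe: a linear head over the encoder output trained with a per-label sigmoid loss, and per-label thresholding at inference instead of argmax. A minimal sketch of that head follows; the encoder output is faked with random features here so the example stays runnable without downloading BERT, and the label count and threshold are assumptions.

```python
# Illustrative only: the shared multi-label classification head.
# In the real projects `features` would be BERT's pooled [CLS] vector.
import torch
import torch.nn as nn

num_labels, hidden = 6, 768
head = nn.Linear(hidden, num_labels)
criterion = nn.BCEWithLogitsLoss()        # one sigmoid/BCE term per label

features = torch.randn(4, hidden)                        # stand-in encoder output
targets = torch.randint(0, 2, (4, num_labels)).float()   # multi-hot label vectors

logits = head(features)
loss = criterion(logits, targets)          # all labels trained jointly
loss.backward()

# Inference: threshold each label independently rather than taking argmax.
predictions = (torch.sigmoid(logits) > 0.5).int()
print(loss.item(), predictions.shape)      # scalar loss, torch.Size([4, 6])
```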