murray-z / multi_label_classification
Multi-label text classification based on PyTorch + BERT
☆101 · Updated last year
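The repository fine-tunes BERT for multi-label classification. As background, a minimal sketch of the core idea (not this repo's actual code): each label gets an independent logit, trained with `BCEWithLogitsLoss` instead of softmax cross-entropy, and predictions come from thresholding sigmoid probabilities. The `hidden_size=768` here is the standard BERT-base pooled-output width; the random tensor stands in for real BERT features.

```python
import torch
import torch.nn as nn

class MultiLabelHead(nn.Module):
    """Classification head for multi-label tasks: one independent logit per label."""
    def __init__(self, hidden_size: int, num_labels: int):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, pooled: torch.Tensor) -> torch.Tensor:
        # pooled: (batch, hidden_size), e.g. BERT's [CLS] representation
        return self.classifier(pooled)

head = MultiLabelHead(hidden_size=768, num_labels=5)
# BCEWithLogitsLoss treats every label as an independent binary problem,
# so a sample may carry zero, one, or several labels at once.
loss_fn = nn.BCEWithLogitsLoss()

pooled = torch.randn(4, 768)                      # stand-in for BERT pooled output
targets = torch.randint(0, 2, (4, 5)).float()     # multi-hot label matrix
loss = loss_fn(head(pooled), targets)

# At inference, threshold the per-label probabilities (0.5 is a common default).
preds = (torch.sigmoid(head(pooled)) > 0.5).long()
```

This is the key difference from single-label classification: no softmax couples the labels, so the decision for each label is made independently.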
Alternatives and similar repositories for multi_label_classification:
Users interested in multi_label_classification are comparing it to the repositories listed below.
- Fine-tuning a BERT PyTorch model for multi-label text classification ☆128 · Updated 5 years ago
- Chinese multi-label classification based on pytorch_bert ☆88 · Updated 3 years ago
- Chinese NER model based on lexical information fusion ☆164 · Updated 2 years ago
- Chinese text classification based on PyTorch + BERT ☆82 · Updated last year
- Classification of very long texts (over 1,000 characters); document-level text classification, mainly addressing the long-range dependency problem ☆126 · Updated 3 years ago
- PyTorch-based named entity recognition framework supporting LSTM+CRF, BERT+CRF, RoBERTa+CRF, and more ☆82 · Updated last year
- CMeEE/CBLUE/NER entity recognition ☆125 · Updated 2 years ago
- PyTorch-based text classification framework supporting TextCNN, BERT, Electra, and more ☆61 · Updated last year
- Implementation of an NER model on a Chinese dataset ☆70 · Updated last year
- Multi-label text classification with Keras and Keras-bert, fine-tuning BERT ☆66 · Updated 3 years ago
- PyTorch implementation of GlobalPointer, handling nested and non-nested NER uniformly ☆384 · Updated last year
- Chinese event extraction based on PyTorch + BERT ☆70 · Updated 2 years ago
- Chinese binary classification with BERT + TextCNN, two implementation approaches ☆21 · Updated 2 years ago
- Experiments with several semantic matching models and a comparison of the results ☆161 · Updated last year
- PyTorch implementation of Chinese named entity recognition based on BERT+BiLSTM+CRF ☆42 · Updated 3 years ago
- Information extraction with pointer networks, covering named entity recognition, relation extraction, and event extraction ☆123 · Updated last year
- Nested named entity recognition (Nested NER) ☆19 · Updated 3 years ago
- Long-text classification in PyTorch, using FastText, TextCNN, TextRNN, TextRCNN, and Transformer ☆46 · Updated 4 years ago
- GPLinker_pytorch ☆80 · Updated 2 years ago
- Multi-model Chinese text classification on the cnews news dataset ☆53 · Updated 4 years ago
- A demo of multi-label text classification using BERT and TextCNN ☆31 · Updated 2 years ago
- SimCSE contrastive-learning model for Chinese semantic similarity ☆81 · Updated 2 years ago
- Applying OneRel to Chinese relation extraction ☆118 · Updated last year
- An optimized version of GlobalPointer / NER entity recognition ☆114 · Updated 3 years ago
- Prompt-based Chinese text classification ☆54 · Updated last year
- Text similarity, semantic vectors, text vectors: text-similarity, similarity, sentence-similarity, BERT, SimCSE, BERT-Whitening, Sentence-BERT, PromCSE, SBERT ☆73 · Updated 2 months ago
- TPLinker for NER, Chinese/English named entity recognition ☆122 · Updated 3 years ago
- Sixth-place solution and code for document-level event extraction and event causality extraction in the financial domain ☆61 · Updated 2 years ago
- NLP sentence encoding, sentence embeddings, and semantic similarity: BERT_avg, BERT_whitening, SBERT, SimCSE ☆175 · Updated 3 years ago
- ☆277 · Updated 2 years ago