SunnyGJing / t5-pegasus-chinese
Chinese abstractive summarization and coreference resolution based on Google's T5 Chinese generative model; supports batch generation and multiprocessing
☆222 · Updated last year
Alternatives and similar repositories for t5-pegasus-chinese:
Users interested in t5-pegasus-chinese are comparing it to the repositories listed below.
- ☆414 · Updated last year
- Chinese generative pre-trained model ☆565 · Updated 3 years ago
- PyTorch implementation of GlobalPointer, handling nested and flat NER uniformly ☆392 · Updated 2 years ago
- End-to-end long-text summarization model (CAIL 2020 judicial summarization track) ☆395 · Updated 10 months ago
- Reproduction of SimCSE on Chinese, supervised + unsupervised ☆275 · Updated last month
- ☆278 · Updated 2 years ago
- ☆267 · Updated 8 months ago
- Chinese language model pre-training in PyTorch ☆390 · Updated 4 years ago
- Experiments on several semantic matching models with a comparison of results ☆161 · Updated last year
- An upgraded SimBERT (SimBERTv2)! ☆441 · Updated 3 years ago
- CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation ☆487 · Updated 2 years ago
- A large-scale Chinese natural language inference and semantic similarity calculation dataset ☆428 · Updated 5 years ago
- News headline summarization model based on T5-PEGASUS ☆18 · Updated 2 years ago
- Very long text classification (over 1,000 characters); document-level/passage-level text classification; mainly addresses long-range dependencies ☆130 · Updated 3 years ago
- Text correction papers ☆303 · Updated last year
- Implementation of SimCSE + ESimCSE on Chinese datasets ☆191 · Updated 2 years ago
- CoSENT, STS, SentenceBERT ☆166 · Updated 2 months ago
- Chinese text paraphrasing and data augmentation based on the LaserTagger model ☆318 · Updated last year
- Chinese natural language inference and semantic similarity datasets ☆346 · Updated 3 years ago
- Chinese BERT with words (rather than characters) as the basic unit ☆466 · Updated 3 years ago
- GlobalPointer: handling nested and flat NER uniformly ☆253 · Updated 3 years ago
- Simple experiments with SimCSE on Chinese tasks ☆604 · Updated last year
- NLP sentence encoding, sentence embeddings, and semantic similarity: BERT_avg, BERT_whitening, SBERT, SimCSE ☆176 · Updated 3 years ago
- Chinese BertSum: a Chinese version of the BertSum extractive model, with sample data and fully commented code; ready to train, predict, and study after download ☆204 · Updated last year
- Chinese summarization model based on GPT-2 ☆409 · Updated last year
- Chinese coreference resolution based on SpanBERT, implemented in PyTorch ☆97 · Updated 2 years ago
- A Chinese abstractive summarization service ☆338 · Updated 3 weeks ago
- Datasets collected during the author's ongoing projects, including both raw and processed data; continuously updated ☆113 · Updated 4 years ago
- Code for the ACL 2021 paper "Lexicon Enhanced Chinese Sequence Labelling Using BERT Adapter" ☆341 · Updated 3 years ago
- Source code for the ACL 2021 paper "PLOME: Pre-training with Misspelled Knowledge for Chinese Spelling Correction" ☆235 · Updated 2 years ago