PolarisRisingWar / text_summarization_chinese
Runnable solutions for major text summarization models on Chinese text
☆67 · Updated last year
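Since the repository bills itself as a set of runnable solutions, a minimal sketch of what running one such model can look like may help orient readers. Everything below is an assumption for illustration: it uses the Hugging Face `transformers` pipeline with the multilingual checkpoint `csebuetnlp/mT5_multilingual_XLSum` (which covers Chinese), not code or models taken from this repository.

```python
# A minimal sketch of Chinese abstractive summarization with the Hugging Face
# transformers pipeline. The checkpoint is an assumed multilingual summarizer
# that covers Chinese; substitute any Chinese-capable seq2seq model
# (e.g. a T5-PEGASUS variant like those listed below).
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="csebuetnlp/mT5_multilingual_XLSum",  # assumption: illustrative checkpoint
)

article = (
    "据报道，某科技公司今日发布了新一代中文预训练语言模型，"
    "该模型在多项文本摘要基准上取得了领先成绩。"
)

result = summarizer(article, max_length=48, min_length=8, do_sample=False)
print(result[0]["summary_text"])
```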
Alternatives and similar repositories for text_summarization_chinese:
Users interested in text_summarization_chinese are comparing it to the repositories listed below
- Experiments with several semantic matching models and a comparison of their results. ☆161 · Updated last year
- Multi-label text classification based on PyTorch + BERT. ☆103 · Updated last year
- ☆74 · Updated 5 years ago
- ☆87 · Updated 3 years ago
- News headline summarization model based on T5-PEGASUS. ☆18 · Updated 2 years ago
- Chinese coreference resolution based on SpanBERT, implemented in PyTorch. ☆97 · Updated 2 years ago
- Chinese document summarization using Google's 2020 PEGASUS model. ☆26 · Updated 2 years ago
- Assorted summarization code and models. ☆39 · Updated 3 years ago
- Chinese machine reading comprehension datasets. ☆103 · Updated 4 years ago
- Sentence matching models, including unsupervised SimCSE, ESimCSE, and PromptBERT, plus supervised SBERT and CoSENT (see the sentence-similarity sketch after this list). ☆98 · Updated 2 years ago
- Summarization and coreference resolution based on Google's Chinese generative T5 model, with support for batched generation and multiprocessing. ☆222 · Updated last year
- PyTorch implementation of unsupervised SimCSE for Chinese. ☆134 · Updated 3 years ago
- Fine-tuning a BERT PyTorch model for multi-label text classification. ☆131 · Updated 5 years ago
- Third-prize solution for the similar-case retrieval track of the 2021 CAIL (法研杯) competition. ☆50 · Updated 3 years ago
- Chinese NER model based on lexical information fusion. ☆166 · Updated 3 years ago
- Pretrained Chinese BigBird model. ☆91 · Updated 2 years ago
- ☆414 · Updated last year
- ☆136 · Updated 3 years ago
- Sentence encoding, sentence embeddings, and semantic similarity for NLP: BERT_avg, BERT_whitening, SBERT, SimCSE. ☆176 · Updated 3 years ago
- CoSENT, STS, SentenceBERT. ☆166 · Updated 2 months ago
- Source code for the paper "PLOME: Pre-training with Misspelled Knowledge for Chinese Spelling Correction" (ACL 2021). ☆235 · Updated 2 years ago
- Transformer-based abstractive text summarization. ☆181 · Updated 2 years ago
- CCL 2022 shared task on text error correction for learners of Chinese. ☆138 · Updated 2 years ago
- Chinese Language Generation Evaluation: a benchmark for Chinese generation tasks. ☆245 · Updated 4 years ago
- Chinese BertSum: a Chinese version of the extractive BertSum model, with sample data and fully commented code; ready to train, predict, and study after download. ☆204 · Updated last year
- An optimized version of GlobalPointer for NER. ☆118 · Updated 3 years ago
- Source code and dataset for the ACL 2022 Findings paper "LEVEN: A Large-Scale Chinese Legal Event Detection dataset". ☆110 · Updated last year
- Prompt-based Chinese text classification. ☆55 · Updated last year
- Named entity recognition with Baidu UIE, implemented in PyTorch. ☆57 · Updated 2 years ago
- Text classification implemented with Hugging Face. ☆58 · Updated 3 years ago
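Several of the entries above (SimCSE, SBERT, CoSENT, BERT_whitening) concern sentence embeddings and semantic similarity. As a rough illustration of that task, here is a minimal sketch using the `sentence-transformers` library; the checkpoint `paraphrase-multilingual-MiniLM-L12-v2` is an assumed multilingual model that handles Chinese, not one drawn from the repositories listed.

```python
# A minimal sketch of sentence similarity in the spirit of the SimCSE / SBERT
# entries above, using the sentence-transformers library. The checkpoint is an
# assumed multilingual model that handles Chinese.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")  # assumption

sentences = ["今天天气很好", "今天天气不错", "股市今天大跌"]
embeddings = model.encode(sentences, convert_to_tensor=True)

# Cosine similarity of the first sentence against the other two; the
# near-paraphrase should score noticeably higher than the unrelated sentence.
scores = util.cos_sim(embeddings[0], embeddings[1:])
print(scores)
```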