baojunshan / nlp-fluency
Evaluates the fluency of natural-language text
☆110 · Updated 3 years ago
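The repository's stated purpose is scoring how fluent a piece of text is. Its actual implementation is not shown here, but a common approach is to use a language model's perplexity: lower perplexity means the text looks more like natural, well-formed language. Below is a minimal, generic sketch using an add-one-smoothed bigram model trained on a toy corpus; the corpus, tokenisation, and function names are illustrative assumptions, not the repo's API.

```python
import math
from collections import Counter

def train_bigram(corpus):
    """Count unigrams and bigrams over whitespace-tokenised sentences."""
    uni, bi = Counter(), Counter()
    for sent in corpus:
        toks = ["<s>"] + sent.split() + ["</s>"]
        uni.update(toks[:-1])          # contexts for the bigram denominator
        bi.update(zip(toks, toks[1:]))
    return uni, bi

def perplexity(sentence, uni, bi):
    """Add-one-smoothed bigram perplexity; lower = more fluent."""
    toks = ["<s>"] + sentence.split() + ["</s>"]
    vocab = len(uni) + 1               # +1 for unseen contexts
    log_p = 0.0
    for a, b in zip(toks, toks[1:]):
        p = (bi[(a, b)] + 1) / (uni[a] + vocab)
        log_p += math.log(p)
    return math.exp(-log_p / (len(toks) - 1))

# Toy usage: a sentence matching the training data scores lower
# (more fluent) than the same words scrambled.
uni, bi = train_bigram(["the cat sat", "the dog sat", "the cat ran"])
fluent = perplexity("the cat sat", uni, bi)
scrambled = perplexity("sat the cat", uni, bi)
```

Real fluency scorers (including, plausibly, this repo) would swap the toy bigram model for a pretrained neural LM such as BERT or GPT, but the perplexity-as-fluency idea is the same.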
Related projects:
- Source code for the paper "PLOME: Pre-training with Misspelled Knowledge for Chinese Spelling Correction" (ACL 2021) · ☆228 · Updated 2 years ago
- ☆256 · Updated last month
- Experiments with several semantic-matching models and a comparison of their results · ☆154 · Updated last year
- ☆126 · Updated last year
- Repository for the paper "A Hybrid Approach to Automatic Corpus Generation for Chinese Spelling Check" · ☆284 · Updated 4 years ago
- A collection of extractive MRC (machine reading comprehension) datasets for Chinese to date · ☆118 · Updated 3 months ago
- CCL 2022 shared task on Chinese learner text error correction · ☆133 · Updated last year
- SIGHAN Chinese spelling-correction datasets and their converted formats · ☆62 · Updated 4 years ago
- A reproduction of SimCSE for Chinese, both supervised and unsupervised · ☆261 · Updated 2 years ago
- A multi-modal Chinese spell checker released at ACL 2021 · ☆145 · Updated 11 months ago
- Text-correction papers · ☆284 · Updated 7 months ago
- An introductory article on dialogue rewriting · ☆96 · Updated last year
- CoSENT, STS, SentenceBERT · ☆161 · Updated last year
- An upgraded SimBERT (SimBERTv2)! · ☆437 · Updated 2 years ago
- Code for the ACL 2021 paper "Tail-to-Tail Non-Autoregressive Sequence Prediction for Chinese Grammatical Error Correction" · ☆99 · Updated 2 years ago
- A framework for cleaning Chinese dialogue data · ☆259 · Updated 3 years ago
- Chinese coreference resolution based on SpanBERT, implemented in PyTorch · ☆95 · Updated last year
- Simple experiments with the P-tuning method on Chinese · ☆138 · Updated 3 years ago
- Datasets from past Chinese grammatical error diagnosis (CGED) shared tasks · ☆33 · Updated 2 years ago
- 🙈 An unofficial implementation of SoftMaskedBert based on huggingface/transformers · ☆93 · Updated 3 years ago
- Summarization and coreference resolution based on Google's Chinese generative T5 model, with batch generation and multiprocessing support · ☆214 · Updated 10 months ago
- Chinese NLP data augmentation, BERT contextual augmentation · ☆111 · Updated 2 years ago
- 3,000,000+ semantic understanding and matching examples, usable for unsupervised contrastive learning, semi-supervised learning, etc., to build state-of-the-art Chinese pre-trained models · ☆274 · Updated last year
- Simple experiments with Pattern-Exploiting Training on Chinese · ☆169 · Updated 3 years ago
- A reproduction of the ACL 2019 paper "Improving Multi-turn Dialogue Modelling with Utterance ReWriter" · ☆129 · Updated 4 years ago
- ☆400 · Updated 6 months ago
- Chinese NLP datasets · ☆151 · Updated 5 years ago
- ☆275 · Updated 2 years ago
- A sentence-embedding tool based on pre-trained models · ☆131 · Updated last year
- Text paraphrasing: Chinese text data augmentation based on the LaserTagger model · ☆317 · Updated 8 months ago