jm12138 / CPM-Generate-Pytorch
☆37 · Updated 4 years ago
Alternatives and similar repositories for CPM-Generate-Pytorch
Users interested in CPM-Generate-Pytorch are comparing it to the libraries listed below.
- Finetune CPM-1 ☆74 · Updated 2 years ago
- ☆101 · Updated 4 years ago
- Loading the CPM_LM model with bert4keras ☆51 · Updated 4 years ago
- An introductory article on dialogue rewriting ☆97 · Updated last year
- Code for CPM-2 Pre-Train ☆158 · Updated 2 years ago
- Chinese version of the UniLM pretrained model ☆83 · Updated 4 years ago
- PyTorch model for https://github.com/imcaspar/gpt2-ml ☆79 · Updated 3 years ago
- NLU & NLG (zero-shot) based on the mengzi-t5-base-mt pretrained model ☆74 · Updated 2 years ago
- Finetune CPM-2 ☆82 · Updated 2 years ago
- A framework for cleaning Chinese dialog data ☆269 · Updated 4 years ago
- Knowledge-graph question answering based on "Seq2Seq + prefix tree" ☆71 · Updated 3 years ago
- Investigating Prior Knowledge for Challenging Chinese Machine Reading Comprehension ☆166 · Updated 3 years ago
- A summary of key points for using the T5 model in Keras ☆172 · Updated 3 years ago
- CCL 2022 shared task on text error correction for Chinese learners ☆141 · Updated 2 years ago
- Loading CDial-GPT with bert4keras ☆38 · Updated 4 years ago
- lasertagger-chinese: a Chinese LaserTagger tutorial with example data, annotations, and shell scripts for running it ☆75 · Updated 2 years ago
- CLUEWSC2020: the Chinese Winograd Schema Challenge (WSC), a Chinese coreference-resolution task ☆75 · Updated 4 years ago
- An MLM-based BERT pretrained model for pinyin-to-Chinese-character conversion with built-in error correction (pinyin corrector), implemented in PyTorch ☆45 · Updated 4 years ago
- Python toolkit for the Chinese Language Understanding Evaluation (CLUE) benchmark ☆129 · Updated last year
- Simple experiments with the P-tuning method on Chinese ☆139 · Updated 4 years ago
- ☆89 · Updated 4 years ago
- Chinese Language Generation Evaluation: a benchmark for Chinese text-generation tasks ☆247 · Updated 4 years ago
- Introduction to CPM ☆165 · Updated 3 years ago
- ☆127 · Updated 2 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆171 · Updated 4 years ago
- ☆75 · Updated 5 years ago
- Source code for the paper "PLOME: Pre-training with Misspelled Knowledge for Chinese Spelling Correction" (ACL 2021) ☆235 · Updated 2 years ago
- Truly "Deep Learning for Humans" ☆141 · Updated 3 years ago
- SIGHAN Chinese spelling-correction datasets and their converted formats ☆64 · Updated 5 years ago
- Official code for the paper "Exploration and Exploitation: Two Ways to Improve Chinese Spelling Correction Models" ☆68 · Updated 3 years ago