deepdialog / CPM-LM-TF2
☆247 · Updated 3 years ago
Alternatives and similar repositories for CPM-LM-TF2
Users interested in CPM-LM-TF2 are comparing it to the libraries listed below.
- GPT2 training script for Chinese in TensorFlow 2.0 ☆153 · Updated 4 years ago
- PyTorch model for https://github.com/imcaspar/gpt2-ml ☆78 · Updated 4 years ago
- Easy-to-use CPM for Chinese text generation ☆534 · Updated 2 years ago
- Fine-tune your own dataset on the gpt2-ml Chinese model ☆44 · Updated 2 years ago
- ☆442 · Updated 7 months ago
- Code for CPM-2 pre-training ☆158 · Updated 2 years ago
- ☆37 · Updated 4 years ago
- ☆442 · Updated 3 years ago
- Load the CPM_LM model with bert4keras ☆51 · Updated 5 years ago
- A Span-Extraction Dataset for Chinese Machine Reading Comprehension (CMRC 2018) ☆444 · Updated 3 years ago
- A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models ☆817 · Updated 5 years ago
- ☆220 · Updated 3 years ago
- Notes on the key points of using the T5 model in Keras ☆174 · Updated 3 years ago
- Poetry-related datasets developed by the THUAIPoet (Jiuge) group ☆232 · Updated 5 years ago
- KdConv: A Chinese Multi-domain Dialogue Dataset Towards Multi-turn Knowledge-driven Conversation ☆495 · Updated 2 years ago
- Faster and more effective Chinese new-word discovery ☆514 · Updated last year
- Large-scale pre-training corpus for Chinese (100 GB) ☆992 · Updated 3 years ago
- ☆51 · Updated 4 years ago
- A Chinese string similarity method based on sound-shape codes (音形码) ☆227 · Updated 5 years ago
- ☆102 · Updated 5 years ago
- Chinese text paraphrasing and data augmentation built on the LaserTagger model ☆324 · Updated last year
- rasa_chinese: a Rasa component extension package for the Chinese language, providing many Chinese-specific components ☆152 · Updated 2 years ago
- Mengzi Pretrained Models ☆539 · Updated 3 years ago
- Transformer-XL for Chinese text generation (can write novels and classical poetry) ☆728 · Updated 3 years ago
- A GPT2-based Chinese abstractive summarization model ☆405 · Updated 2 years ago
- An upgraded SimBERT (SimBERTv2)! ☆445 · Updated 3 years ago
- UniLM-based compliment-style (夸夸) Chinese chitchat bot ☆158 · Updated 4 years ago
- Open Language Pre-trained Model Zoo ☆1,005 · Updated 4 years ago
- Chinese generative pre-trained model ☆569 · Updated 3 years ago
- Chinese and English NER datasets, Chinese-English machine translation datasets, and a Chinese word-segmentation dataset ☆369 · Updated 4 years ago