wind91725 / gpt2-ml-finetune-
Fine-tune your own dataset with the gpt2-ml Chinese model
☆43 · Updated last year
Alternatives and similar repositories for gpt2-ml-finetune-:
- lasertagger-chinese: a Chinese LaserTagger learning example with sample data, annotations, and shell run scripts ☆75 · Updated last year
- ☆73 · Updated 5 years ago
- Chinese GPT2: pre-training and fine-tuning framework for text generation ☆188 · Updated 3 years ago
- Chinese Transformer Generative Pre-Training Model ☆59 · Updated 5 years ago
- A model addressing the lack of diversity in generation tasks (translation, paraphrasing, etc.) ☆45 · Updated 5 years ago
- PyTorch model for https://github.com/imcaspar/gpt2-ml ☆79 · Updated 3 years ago
- ☆102 · Updated 4 years ago
- UniLM for Chinese chitchat robot: a UniLM-based "kuakua" (compliment-style) chatbot project ☆157 · Updated 4 years ago
- Chinese generative pre-trained model ☆98 · Updated 4 years ago
- Chinese UniLM pre-trained model ☆83 · Updated 4 years ago
- A Sentence Cloze Dataset for Chinese Machine Reading Comprehension (CMRC 2019) ☆126 · Updated 2 years ago
- ☆89 · Updated 4 years ago
- Investigating Prior Knowledge for Challenging Chinese Machine Reading Comprehension ☆166 · Updated 2 years ago
- Chinese Language Generation Evaluation: a benchmark for Chinese generation tasks ☆246 · Updated 4 years ago
- Transformer-based pointer-generator network ☆91 · Updated 4 years ago
- Loading the CPM_LM model with bert4keras ☆51 · Updated 4 years ago
- Pre-trained Chinese XLNet model (XLNet_Large) ☆229 · Updated 5 years ago
- Notes on using the T5 model in Keras ☆172 · Updated 2 years ago
- BERT fine-tuning for CMRC2018, CJRC, DRCD, CHID, C3 ☆182 · Updated 4 years ago
- Reading comprehension based on the pre-trained BERT model ☆92 · Updated last year
- Collections of Chinese reading comprehension datasets ☆215 · Updated 5 years ago
- Fine-tune CPM-1 ☆75 · Updated last year
- Loading CDial-GPT with bert4keras ☆38 · Updated 4 years ago
- ☆155 · Updated 3 years ago
- Chinese text modification based on the LaserTagger model, named "文本手术刀" (Text Scalpel). Currently implements a text paraphrasing task for data augmentation of NLP corpora ☆210 · Updated last year
- Using BERT for the LIC2019 machine reading comprehension task ☆89 · Updated 5 years ago
- "We take bragging seriously" (tongue-in-cheek project description) ☆44 · Updated 2 years ago
- Generate BERT/ALBERT sentence embeddings with a single command, for similarity computation, text classification, and more ☆34 · Updated 2 years ago
- Slot filling and intent prediction (spoken language understanding) paper collection with Chinese translations ☆48 · Updated 5 years ago
- A RoBERTa-wwm base model distilled from RoBERTa-wwm-large ☆65 · Updated 4 years ago