LianxinRay / bert_wwm_ngram_masking_of_chinese
Chinese whole-word masking (WWM) and n-gram masking based on jieba
☆11 · Updated 5 years ago
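As a rough illustration of the two masking strategies named in the description (not the repo's actual code or API), the sketch below builds whole-word masking and n-gram masking on top of jieba segmentation. The function names, the 15% masking probability, and the `[MASK]` placeholder are assumptions made for the example.

```python
# Illustrative sketch only: Chinese whole-word masking and n-gram masking
# driven by jieba word segmentation. Not taken from this repository.
import random
import jieba

MASK = "[MASK]"

def wwm_mask(text, mask_prob=0.15):
    """Whole-word masking: when a jieba-segmented word is selected,
    every character of that word is replaced with [MASK] together."""
    tokens = []
    for word in jieba.cut(text):
        if random.random() < mask_prob:
            tokens.extend(MASK for _ in word)  # mask all characters of the word
        else:
            tokens.extend(word)                # keep the word's characters as-is
    return tokens

def ngram_mask(text, mask_prob=0.15, max_n=3):
    """N-gram masking: select a span of 1..max_n consecutive jieba words
    and replace all of their characters with [MASK]."""
    words = list(jieba.cut(text))
    tokens, i = [], 0
    while i < len(words):
        if random.random() < mask_prob:
            n = random.randint(1, max_n)
            span = "".join(words[i:i + n])
            tokens.extend(MASK for _ in span)  # mask the whole n-gram span
            i += n
        else:
            tokens.extend(words[i])
            i += 1
    return tokens

if __name__ == "__main__":
    sentence = "自然语言处理是人工智能的重要方向"
    print(wwm_mask(sentence))
    print(ngram_mask(sentence))
```

The difference is only granularity: WWM masks all characters of one segmented word at a time, while n-gram masking masks a contiguous span of up to `max_n` words.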
Alternatives and similar repositories for bert_wwm_ngram_masking_of_chinese:
Users who are interested in bert_wwm_ngram_masking_of_chinese are comparing it to the libraries listed below
- A distilled RoBERTa-wwm base model, distilled with RoBERTa-wwm-large as the teacher ☆65 · Updated 4 years ago
- Chinese pretrained UniLM ☆28 · Updated 4 years ago
- Data Augmentation with a Generation Approach for Low-resource Tagging Tasks ☆80 · Updated 4 years ago
- CGED & CSC (Chinese grammatical error diagnosis and Chinese spelling check) ☆22 · Updated 4 years ago
- Experiments on text matching with the Chinese dataset LCQMC ☆27 · Updated 5 years ago
- 5th place solution for the Duplicate Question Detection competition, based on adversarial attack ☆39 · Updated 4 years ago
- Global Artificial Intelligence Technology Innovation Competition, Track 3: Xiaobu Assistant short-text semantic matching for dialogue ☆37 · Updated 3 years ago
- Chinese machine reading comprehension: 3rd place, technical group, 2021 Haihua AI Challenge ☆21 · Updated 3 years ago
- ☆127 · Updated 2 years ago
- Keyword extraction project ☆24 · Updated 4 years ago
- ☆11 · Updated 4 years ago
- Joint learning of machine retrieval and reading; rank-6 solution for the 莱斯杯 (Les Cup) 2nd National "Military Intelligent Machine Reading" Challenge ☆127 · Updated 4 years ago
- Chinese NLP pretrained model generalization ability challenge (https://tianchi.aliyun.com/competition/entrance/531841/introduction?spm=5176.12281957.1004.2.7a883eafYhYhOq) ☆32 · Updated 4 years ago
- The enhanced RCNN model used for sentence similarity classification ☆43 · Updated 3 years ago
- PyTorch implementation of the Sequential Matching Network ☆21 · Updated 5 years ago
- Introductory article on dialogue rewriting ☆95 · Updated last year
- PyTorch version of the UniLM model ☆26 · Updated 3 years ago
- Fine-tune BERT on Chinese data with character-level masking and whole-word masking respectively, without using TensorFlow Estimator ☆23 · Updated 4 years ago
- Chinese UniLM pretrained model ☆83 · Updated 4 years ago
- ☆34 · Updated 3 years ago
- Chinese NLP pretrained model generalization ability challenge ☆42 · Updated 4 years ago
- Source code and model for 5th place on leaderboard B of the 2019 Language and Intelligence Technology Competition, knowledge-driven dialogue track ☆27 · Updated 5 years ago
- SOTA solution and online demo for CTC2021, the Chinese Text Correction Competition ☆72 · Updated last year
- Dynamic Connected Networks for Chinese Spelling Check ☆50 · Updated 10 months ago
- BERT multi-GPU training and pretraining ☆29 · Updated 4 years ago
- Chinese machine reading comprehension dataset ☆64 · Updated 5 years ago
- Using LEAR for NER extraction ☆29 · Updated 2 years ago
- ☆88 · Updated 3 years ago
- Pretrained-model usage based on TensorFlow 1.x, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision, with flexible training, validation, and prediction ☆58 · Updated 3 years ago
- 2021 Sohu Campus Text Matching Algorithm Competition, solution by team "分比我们低的都是帅哥" ☆42 · Updated 3 years ago