CLUEbenchmark / LightLM
High-performance lightweight model evaluation. Shared Tasks in NLPCC 2020, Task 1 - Light Pre-Training Chinese Language Model for NLP Task
☆57 · Updated 4 years ago
Related projects
Alternatives and complementary repositories for LightLM
- ☆59 · Updated 5 years ago
- TensorFlow version of BERT-of-Theseus ☆63 · Updated 3 years ago
- BERT-of-Theseus via bert4keras ☆31 · Updated 4 years ago
- A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm with RoBERTa-wwm-large as the teacher ☆65 · Updated 4 years ago
- 24×2 pre-trained small BERT models, a handy tool for NLP practitioners ☆51 · Updated 4 years ago
- Pre-trained Chinese ELECTRA model: pre-training a Chinese model based on adversarial learning ☆140 · Updated 4 years ago
- Problems encountered when applying seq2seq (S2S) models in practice, and how to solve them ☆27 · Updated 4 years ago
- Convert PyTorch BERT weights to TensorFlow ☆21 · Updated 4 years ago
- An introduction to high-quality chitchat datasets ☆29 · Updated 5 years ago
- Methods for unsupervised text generation ☆49 · Updated 3 years ago
- Chinese translation of the paper "XLNet: Generalized Autoregressive Pretraining for Language Understanding" ☆50 · Updated 5 years ago
- DistilBERT for Chinese: distilled BERT models pre-trained on massive Chinese corpora ☆90 · Updated 4 years ago
- Chinese version of the ACL 2020 Program Committee blog posts ☆14 · Updated 4 years ago
- Python toolkit for the Chinese Language Understanding Evaluation (CLUE) benchmark ☆128 · Updated last year
- On-device (offline) reading-comprehension application: QA for mobile, Android & iPhone ☆60 · Updated 2 years ago
- Subword Encoding in Lattice LSTM for Chinese Word Segmentation ☆54 · Updated 5 years ago
- Code for the AI Challenger 2018 machine reading comprehension task ☆27 · Updated 5 years ago
- A large-scale cleaned Chinese chitchat corpus and Chinese DialoGPT models ☆34 · Updated 4 years ago
- Dataset for the CIKM 2018 paper "Multi-Source Pointer Network for Product Title Summarization" ☆73 · Updated 6 years ago
- UNF (Universal NLP Framework) ☆70 · Updated 4 years ago
- ☆48 · Updated 3 years ago
- ☆89 · Updated 4 years ago
- Modifications of the official BERT code for downstream tasks ☆31 · Updated last year
- Dataset and baseline for SMP-MCC2020 ☆23 · Updated last year
- 2020 Language and Intelligence Technology Competition: recommendation-oriented dialogue task ☆51 · Updated 3 years ago
- KenLM language model, served via a Python REST service ☆29 · Updated 6 years ago
- EasyTransfer is designed to make the development of transfer learning in NLP applications easier. ☆8 · Updated 4 years ago