CLUEbenchmark / ZeroCLUE
Zero-shot learning evaluation benchmark, Chinese version
☆54 · Updated 3 years ago
Alternatives and similar repositories for ZeroCLUE:
Users interested in ZeroCLUE are comparing it to the repositories listed below.
- Baichuan implementation of Dynamic NTK-ALiBi: inference over longer texts without fine-tuning ☆47 · Updated last year
- NLU & NLG (zero-shot) based on the mengzi-t5-base-mt pretrained model ☆74 · Updated 2 years ago
- ☆53 · Updated 2 years ago
- CTC2021: SOTA solution and online demo for the Chinese Text Correction 2021 competition ☆72 · Updated last year
- CLUEWSC2020: Chinese version of the Winograd Schema Challenge (WSC), a Chinese coreference resolution task ☆72 · Updated 4 years ago
- Introductory article on dialogue rewriting ☆95 · Updated last year
- OPD: Chinese Open-Domain Pre-trained Dialogue Model ☆74 · Updated last year
- Chinese UniLM pretrained model ☆83 · Updated 3 years ago
- Knowledge-graph question answering based on "Seq2Seq + prefix tree" ☆70 · Updated 3 years ago
- Source code for the ACL 2023 paper "Decoder Tuning: Efficient Language Understanding as Decoding" ☆48 · Updated last year
- Dataset for the Findings of ACL 2023 paper "VCSum: A Versatile Chinese Meeting Summarization Dataset" ☆33 · Updated last year
- NTK-scaled version of the ALiBi position encoding in Transformers ☆67 · Updated last year
- Loading CDial-GPT with bert4keras ☆38 · Updated 4 years ago
- Source code for the EMNLP 2021 paper "A Lightweight Pretrained Model for Chinese Spelling Check" ☆14 · Updated 3 years ago
- OCNLI: Original Chinese Natural Language Inference task ☆150 · Updated 3 years ago
- NBCE: Naive Bayes-based Context Extension on ChatGLM-6B ☆14 · Updated last year
- lasertagger-chinese: Chinese LaserTagger learning example with sample data, annotations, and shell scripts to run it ☆75 · Updated last year
- Chinese BigBird pretrained model ☆91 · Updated 2 years ago
- Loading the CPM_LM model with bert4keras ☆51 · Updated 4 years ago
- Toolkit for time expression extraction, parsing, and normalization ☆50 · Updated 2 years ago
- LoRA fine-tuning of BLOOMZ, following BELLE ☆25 · Updated last year
- Corpus and code for the EMNLP 2022 paper "FCGEC: Fine-Grained Corpus for Chinese Grammatical Error Correction" | FCGEC Chinese grammatical error correction corpus and STG model ☆110 · Updated 2 months ago
- ☆89 · Updated 4 years ago
- Tracking the latest openings for AI roles in NLP, CV, search, recommendation, and related areas ☆28 · Updated last year
- Chinese large language model evaluation, round 2 ☆70 · Updated last year
- Python toolkit for the Chinese Language Understanding Evaluation (CLUE) benchmark ☆128 · Updated last year
- Chinese large language model evaluation, round 1 ☆108 · Updated last year
- ☆59 · Updated last year
- A large-scale cleaned Chinese chitchat corpus and Chinese DialoGPT models ☆34 · Updated 4 years ago
- A RoBERTa-wwm base model distilled from RoBERTa-wwm-large ☆65 · Updated 4 years ago