clue-ai / PromptCLUE
PromptCLUE: a zero-shot learning model supporting all Chinese NLP tasks
☆655 · Updated last year
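For context, here is a minimal zero-shot inference sketch for PromptCLUE using the standard transformers T5 classes. The Hugging Face model id `ClueAI/PromptCLUE-base` and the example prompt are assumptions for illustration; adjust them to wherever your checkpoint actually lives.

```python
# Minimal zero-shot inference sketch for PromptCLUE (not the official example).
# Assumption: the checkpoint is published on the Hugging Face Hub as
# "ClueAI/PromptCLUE-base"; change model_id if your copy lives elsewhere.
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_id = "ClueAI/PromptCLUE-base"  # assumed hub id
tokenizer = T5Tokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Tasks are expressed as Chinese prompts; here, a sentiment-classification prompt.
prompt = "情感分析:\n这个手机质量很好,用起来非常流畅。\n选项:积极,消极\n答案:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```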
Related projects
Alternatives and complementary repositories for PromptCLUE
- pCLUE: a multi-task prompt-learning dataset with 1,000,000+ examples ☆469 · Updated 2 years ago
- Mengzi Pretrained Models ☆534 · Updated last year
- TextGen: Implementation of text generation models, including LLaMA, ChatGLM, BLOOM, GPT2, BART, T5, SongNet, and so on ☆935 · Updated 2 months ago
- Exploring the fine-tuning performance of Chinese instruct data on ChatGLM and LLaMA ☆388 · Updated last year
- ChatGLM-6B instruction learning | instruction data | Instruct ☆653 · Updated last year
- An upgraded version of SimBERT (SimBERTv2)! ☆438 · Updated 2 years ago
- [COLING 2022] CSL: A Large-scale Chinese Scientific Literature Dataset ☆569 · Updated last year
- Large-scale Pre-training Corpus for Chinese (100 GB) ☆925 · Updated 2 years ago
- Unified embedding model ☆833 · Updated last year
- A sentence-embedding approach more effective than Sentence-BERT ☆359 · Updated 2 years ago
- Analysis of the Chinese cognitive abilities of language models ☆235 · Updated last year
- Collections of resources from the Joint Laboratory of HIT and iFLYTEK Research (HFL) ☆364 · Updated last year
- Chinese instruction fine-tuning dataset for Alpaca ☆391 · Updated last year
- A BERT for retrieval and generation ☆846 · Updated 3 years ago
- ChatGLM2-6B full-parameter fine-tuning, with efficient fine-tuning for multi-turn dialogue ☆395 · Updated last year
- A PyTorch implementation of the PaddleNLP UIE model ☆599 · Updated last year
- A semantic understanding and matching dataset with 3,000,000+ examples; usable for unsupervised contrastive learning, semi-supervised learning, etc., to build the best-performing Chinese pre-trained models ☆280 · Updated 2 years ago
- A manually curated Chinese dialogue dataset and fine-tuning code for ChatGLM ☆1,158 · Updated 6 months ago
- A large-scale Chinese natural language inference and semantic similarity calculation dataset ☆425 · Updated 4 years ago
- ☆297 · Updated last year
- MiniRBT (a series of small Chinese pre-trained models) ☆253 · Updated last year
- A framework for cleaning Chinese dialog data ☆261 · Updated 3 years ago
- CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation ☆481 · Updated last year
- A Chinese generative pre-trained model ☆555 · Updated 2 years ago
- The online version is temporarily unavailable because we cannot afford the key. You can clone and run it locally. Note: we set default ope… ☆790 · Updated 5 months ago
- KgCLUE: large-scale open-source Chinese knowledge graph question answering ☆429 · Updated 2 years ago
- Revisiting Pre-trained Models for Chinese Natural Language Processing (MacBERT) ☆645 · Updated last year
- A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models ☆805 · Updated 4 years ago
- Efficient 4-bit QLoRA fine-tuning of ChatGLM-6B/ChatGLM2-6B with the peft library, plus merging the LoRA model into the base model and 4-bit quantization (see the sketch after this list) ☆356 · Updated last year
- Firefly Chinese LLaMA-2 large model, supporting continued pre-training of Baichuan2, Llama2, Llama, Falcon, Qwen, Baichuan, InternLM, Bloom, and other large models ☆397 · Updated last year
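For the QLoRA item above, a rough setup sketch with peft follows. The base checkpoint id, target module name, and hyperparameters are assumptions for illustration, not that repository's exact configuration.

```python
# Rough 4-bit QLoRA setup sketch with peft (illustrative, assumptions marked).
import torch
from transformers import AutoModel, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base_id = "THUDM/chatglm2-6b"  # assumed base checkpoint

# Load the base model in 4-bit (NF4) precision.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
model = AutoModel.from_pretrained(
    base_id, quantization_config=bnb_config, trust_remote_code=True
)

# Attach LoRA adapters to the attention projections and train only those.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=8,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["query_key_value"],  # assumed module name for ChatGLM
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# Train with your preferred trainer, then merge the adapters back into the
# base weights, e.g. merged = model.merge_and_unload()
```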