Macielyoung / Bloom-Lora
Fine-tune the BLOOM large language model with the LoRA method
☆28 · Updated last year
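For orientation, the snippet below is a minimal, hypothetical sketch of what LoRA fine-tuning of a BLOOM model typically looks like with the Hugging Face transformers and peft libraries; the model name, rank, and other hyperparameters are illustrative assumptions, not values taken from this repository.

```python
# Hypothetical sketch: attach LoRA adapters to a BLOOM model with peft.
# Model name, target modules, and hyperparameters are assumptions for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

base_model = "bigscience/bloom-560m"  # assumed small BLOOM variant
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# LoRA configuration: inject low-rank adapters into BLOOM's fused attention projection.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                 # low-rank dimension
    lora_alpha=16,                       # scaling factor
    lora_dropout=0.05,
    target_modules=["query_key_value"],  # BLOOM's attention projection module
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()       # only the adapter weights are trainable
```

The wrapped model can then be trained with an ordinary transformers `Trainer` loop; because only the low-rank adapter weights receive gradients, memory use stays far below full fine-tuning.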
Alternatives and similar repositories for Bloom-Lora:
Users interested in Bloom-Lora are comparing it to the libraries listed below.
- Code implementation of Dynamic NTK-ALiBi for Baichuan: inference on longer texts without any fine-tuning ☆47 · Updated last year
- Chinese instruction-tuning datasets ☆126 · Updated 10 months ago
- ChatGLM-6B fine-tuning. ☆135 · Updated last year
- Zero-shot learning evaluation benchmark, Chinese version ☆54 · Updated 3 years ago
- NTK-scaled version of the ALiBi position encoding in Transformer. ☆67 · Updated last year
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ☆19 · Updated last year
- Text deduplication ☆68 · Updated 8 months ago
- ChatGLM2-6B fine-tuning: SFT/LoRA, instruction fine-tuning ☆105 · Updated last year
- Simple implementation of using LoRA from the peft library to fine-tune ChatGLM-6B ☆85 · Updated last year
- LoRA fine-tuning of BLOOMZ, following BELLE ☆25 · Updated last year
- Use ChatGLM to perform text embedding ☆45 · Updated last year
- GoGPT: Chinese-English enhanced large models trained on Llama/Llama 2 | Chinese-Llama2 ☆78 · Updated last year
- Upgraded version of RoFormer ☆152 · Updated 2 years ago
- CTC2021: SOTA solution and online demo for the Chinese Text Correction competition ☆72 · Updated last year
- NLU & NLG (zero-shot) based on the mengzi-t5-base-mt pretrained model ☆74 · Updated 2 years ago
- MD5 links for a Chinese book corpus ☆213 · Updated last year
- ☆27 · Updated last year
- MOSS chat fine-tuning ☆50 · Updated 9 months ago
- GTS Engine: a powerful NLU training system. GTS-Engine is an out-of-the-box, high-performance natural language understanding engine focused on few-shot tasks, able to automatically produce NLP models from only a handful of samples. ☆91 · Updated last year
- SuperCLUE-Math6: an exploration of a new generation of native-Chinese multi-turn, multi-step mathematical reasoning datasets ☆53 · Updated last year
- ChatGLM-6B fine-tuning/LoRA/PPO/inference; samples are auto-generated integer/decimal arithmetic (addition, subtraction, multiplication, division); runs on GPU or CPU ☆164 · Updated last year
- Large-scale Chinese corpus ☆40 · Updated 5 years ago
- Chinese large language model evaluation, round one ☆108 · Updated last year
- Template-based text correction; Automatically Mining Error Templates for Grammatical Error Correction ☆38 · Updated 2 years ago
- Text embedding ☆144 · Updated last year
- ☆173 · Updated last year
- Chinese BigBird pretrained model ☆91 · Updated 2 years ago
- ☆304 · Updated last year
- Deep training task ☆29 · Updated last year
- CLUEWSC2020: Chinese version of the WSC Winograd Schema Challenge, a Chinese coreference resolution task ☆72 · Updated 4 years ago