zejunwang1 / gpt2classifier
Fine-tuning a pretrained Chinese GPT2 model for text classification
☆24 · Updated 2 years ago
Alternatives and similar repositories for gpt2classifier
Users interested in gpt2classifier are comparing it with the repositories listed below:
- LLM for NER ☆80 · Updated last year
- Fine-tuning of llama, chatglm, and other models ☆91 · Updated last year
- A PyTorch implementation of BERT with downstream-task fine-tuning ☆54 · Updated 3 years ago
- Hands-on information extraction with llama ☆101 · Updated 2 years ago
- Fine-tuning Chinese large language models with QLoRA, covering ChatGLM, Chinese-LLaMA-Alpaca, and BELLE ☆89 · Updated 2 years ago
- Simple and efficient multi-GPU fine-tuning of large models with deepspeed + trainer ☆130 · Updated 2 years ago
- Parameter-efficient fine-tuning of ChatGLM-6B with LoRA and P-Tuning v2 ☆55 · Updated 2 years ago
- Runnable solutions for the major text-summarization models on Chinese text ☆69 · Updated 2 years ago
- moss chat finetuning ☆51 · Updated last year
- Baichuan-Chat fine-tuning with LoRA, QLoRA, and other methods, runnable with one click ☆71 · Updated 2 years ago
- Instruction fine-tuning of the BLOOM model ☆24 · Updated 2 years ago
- 🌈 NERpy: Implementation of Named Entity Recognition using Python. A named entity recognition toolkit supporting BertSoftmax, BertSpan, and other models, ready to use out of the box ☆116 · Updated last year
- SimCSE contrastive-learning model for Chinese semantic similarity ☆90 · Updated 3 years ago
- easy-bert is a Chinese NLP toolkit offering many BERT variants and tuning methods for fast adoption; its clean design and code comments also make it well suited for learning ☆83 · Updated 3 years ago
- ChatGLM2-6B fine-tuning, SFT/LoRA, instruction finetune ☆110 · Updated 2 years ago
- PyTorch-based named entity recognition with Baidu UIE ☆56 · Updated 2 years ago
- Chinese NER model based on lexical information fusion ☆170 · Updated 3 years ago
- GoGPT: Chinese-English enhanced large models trained on Llama/Llama 2 | Chinese-Llama2 ☆79 · Updated 2 years ago
- Experiments with several semantic matching models and a comparison of their results ☆163 · Updated last month
- Multi-label text classification with pytorch + bert ☆109 · Updated 2 years ago
- Fine-tuning ChatGLM with LoRA ☆49 · Updated 2 years ago
- Alibaba Tianchi: 2023 Global Intelligent Vehicle AI Challenge, Track 1: retrieval-based QA with large AI models, baseline scoring 80+ ☆117 · Updated last year
- Llama2-SFT: Llama-2-7B fine-tuning (transformers) / LoRA (peft) / inference ☆27 · Updated 2 years ago
- Summary and comparison of Chinese classification models ☆36 · Updated 3 years ago
- A Python implementation of the BM25 text-matching algorithm ☆33 · Updated 3 years ago
- Training and evaluating your own text-similarity dataset with sentence-transformers (SBert) ☆49 · Updated 4 years ago
- Basic framework for RAG (retrieval-augmented generation) ☆86 · Updated last year
- ☆41 · Updated 4 years ago
- Text classification with large language models ☆91 · Updated last year
- A PyTorch implementation of unsupervised Chinese SimCSE ☆135 · Updated 4 years ago