Mingrui-Li / Qwen-VL-Lora-Model
A Qwen-VL model that can be successfully fine-tuned with LoRA
☆18 · Updated last year
Alternatives and similar repositories for Qwen-VL-Lora-Model:
- Adds some files missing from VisualGLM so that it can be trained; the example performs physiognomy-style face recognition ☆13 · Updated last year
- ☆21 · Updated 6 months ago
- LLM + RAG for QA ☆21 · Updated last year
- Fine-tuning a large model for insurance-domain knowledge based on internlm-chat-7b ☆15 · Updated 10 months ago
- An open-source multimodal large language model based on baichuan-7b ☆73 · Updated last year
- Baichuan-13B instruction fine-tuning ☆89 · Updated last year
- Gradually open-sourcing deep-learning models and datasets for the medical industry ☆13 · Updated 3 years ago
- This project mainly explores what effect can be achieved by fine-tuning an LLM of about 6B parameters (ChatGLM-6B) in a vertical domain (Romance…) ☆25 · Updated last year
- ATEC2023, Track 1: knowledge injection for large models; sharing of the Rank-7 solution ☆21 · Updated 4 months ago
- Training a Chinese mini language model from scratch that can hold basic conversations, with the model size chosen to fit the hardware at hand ☆58 · Updated 6 months ago
- A curated collection of data-competition news from China and abroad ☆18 · Updated 3 years ago
- Fine-tuning based on ChatGLM2-6B, covering full-parameter, parameter-efficient, and quantization-aware training; supports instruction fine-tuning, multi-turn dialogue fine-tuning, and more ☆25 · Updated last year
- One codebase for instruction fine-tuning of large models ☆38 · Updated last year
- In visual information-extraction tasks, using OCR results to constrain the answers of multimodal large models ☆26 · Updated 2 months ago
- ChatGLM2-6B fine-tuning: SFT/LoRA, instruction fine-tuning ☆105 · Updated last year
- ChatGLM-6B for tool applications using LangChain ☆75 · Updated last year
- Parameter-efficient fine-tuning of ChatGLM-6B based on LoRA and P-Tuning v2 ☆54 · Updated last year
- Building a VLM model starting from the basic modules ☆13 · Updated 10 months ago
- Baichuan and Baichuan2 fine-tuning, plus Alpaca fine-tuning ☆32 · Updated 10 months ago
- ☆25 · Updated 4 months ago
- Baichuan LLM supervised fine-tuning with LoRA ☆62 · Updated last year
- General layout analysis | Chinese document parsing | Document layout analysis | layout parser ☆46 · Updated 8 months ago
- Llama2-SFT: Llama-2-7B fine-tuning (transformers) / LoRA (peft) / inference ☆24 · Updated last year
- Workshop on Foundation Models: 1st foundation-model challenge, Track 1 codebase (Open TransMind v1.0) ☆18 · Updated last year
- A repo for updating and debugging Mixtral-8x7B, MoE, ChatGLM3, LLaMA2, Baichuan, Qwen, and other LLMs, including new models such as Mixtral 8x7B, … ☆43 · Updated this week
- BERT model acceleration and deployment with TensorRT ☆9 · Updated 2 years ago
- A pure-C++ cross-platform LLM acceleration library, callable from Python; supports Baichuan, GLM, LLaMA, and MOSS base models, runs ChatGLM-6B-class models smoothly on mobile, and reaches 10000+ tokens/s on a single GPU ☆45 · Updated last year
- First-place (top-1) solution to the Tianchi algorithm competition "BetterMixture - Large Model Data Mixture Challenge" ☆27 · Updated 7 months ago
- Qwen1.5-SFT (Alibaba): Qwen_Qwen1.5-2B-Chat / Qwen_Qwen1.5-7B-Chat fine-tuning (transformers) / LoRA (peft) / inference ☆53 · Updated 9 months ago
- In RAG, generating and matching embedding vectors is a key step. This project provides an embedding service based on CLIP/BLIP models, supporting embedding generation and similarity computation for text and images, as a foundation for multimodal information retrieval ☆14 · Updated 2 months ago
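Many of the repositories above revolve around LoRA fine-tuning. As a minimal sketch (not taken from any repo listed here), LoRA replaces a full weight update with a low-rank one: the frozen projection `W x` is augmented by `(alpha / r) * B A x`, where `A` and `B` are small trainable matrices and `B` starts at zero so training begins from the base model's behavior. The names and dimensions below are illustrative assumptions:

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=16, r=8):
    # Frozen base projection plus the low-rank LoRA update, scaled by alpha / r.
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

rng = np.random.default_rng(0)
d_in, d_out, r = 64, 64, 8
W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01   # A: small random init (trainable)
B = np.zeros((d_out, r))                # B: zero init, so the update starts at 0
x = rng.normal(size=(4, d_in))

# At initialization the LoRA branch contributes nothing:
assert np.allclose(lora_forward(x, W, A, B), x @ W.T)
```

The point of the zero-initialized `B` is that only `A` and `B` (2 * r * d parameters instead of d * d) need gradients, which is why the small fine-tuning repos above can adapt 6B-13B models on modest hardware.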
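The last entry describes embedding generation and similarity matching for multimodal RAG. The matching step typically reduces to cosine similarity over embedding vectors; the sketch below uses toy vectors standing in for CLIP/BLIP outputs (the data and names are invented for illustration, not taken from that project):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy "embeddings" standing in for CLIP/BLIP model outputs.
query = np.array([0.2, 0.9, 0.1])
docs = {
    "cat_photo":  np.array([0.25, 0.85, 0.05]),
    "car_manual": np.array([0.90, 0.05, 0.40]),
}

# Retrieval = pick the document whose embedding is closest to the query's.
best = max(docs, key=lambda name: cosine_similarity(query, docs[name]))
```

Because CLIP-style models embed text and images into the same vector space, the same similarity function works whether `query` came from a sentence or a picture.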