scchy / XtunerGUI
Xtuner Factory
☆35 · Updated last year
Alternatives and similar repositories for XtunerGUI
Users interested in XtunerGUI are comparing it to the repositories listed below.
- Built on the robust XTuner backend framework, XTuner Chat GUI offers a user-friendly platform for quick and efficient local model inferen… ☆13 · Updated last year
- As the name suggests: a hand-built RAG. ☆130 · Updated last year
- In this fast-paced world, we all need a little something to spice up life. Whether you need a glass of sweet talk to lift your spirits or… ☆60 · Updated 7 months ago
- Here is a demo for a PDF parser (including OCR and object detection tools). ☆36 · Updated last year
- A role-playing multi-LLM chat room fine-tuned on InternLM2, built from the original text of Journey to the West, its vernacular translation, and ChatGPT-generated data. This project covers everything about role-playing LLMs, from data acquisition and processing, to fine-tuning with XTuner and deploying to OpenXLab, to deployment with LMDeploy, using op… ☆106 · Updated last year
- ☆79 · Updated last year
- Training a LLaVA model with better Chinese support, with the training code and data open-sourced. ☆77 · Updated last year
- GLM Series Edge Models ☆156 · Updated 6 months ago
- A unified tool to generate fine-tuning datasets for LLMs, including questions, answers, and dialogues. ✨🤖📚💬 ☆63 · Updated 9 months ago
- An ecosystem of tools for large language models and multimodal models, mainly covering cross-modal search, speculative decoding, QAT quantization, multimodal quantization, chatbots, and OCR. ☆194 · Updated 4 months ago
- A large language model for ophthalmology consultations. ☆100 · Updated last year
- The simplest reproduction of R1-style results on small models, illustrating the most important essence of O1-like models and DeepSeek R1: "Think is all you need." Experiments support that, for strong reasoning ability, the content of the think process is the core of AGI/ASI. ☆45 · Updated 10 months ago
- ☆72 · Updated last year
- Delta-CoMe can achieve near-lossless 1-bit compression; accepted at NeurIPS 2024. ☆59 · Updated last year
- Qwen1.5-SFT (Alibaba): fine-tuning Qwen_Qwen1.5-2B-Chat/Qwen_Qwen1.5-7B-Chat with transformers, LoRA (peft), and inference. ☆69 · Updated last year
- ☆20 · Updated 2 years ago
- [ACL 2025 demo track] ROGRAG: A Robustly Optimized GraphRAG Framework ☆188 · Updated 2 weeks ago
- ☆95 · Updated last year
- ☆47 · Updated 8 months ago
- ☆28 · Updated last year
- ☆106 · Updated 9 months ago
- An open-source multimodal large language model based on baichuan-7b. ☆72 · Updated 2 years ago
- Build a simple basic multimodal large model from scratch. 🤖 ☆47 · Updated last year
- SUS-Chat: Instruction tuning done right ☆49 · Updated last year
- ☆234 · Updated last year
- ☆187 · Updated 10 months ago