BaihaiAI / IDPChat
IDPChat is an open Chinese multimodal model
☆55 · Updated 2 years ago
Alternatives and similar repositories for IDPChat
Users interested in IDPChat are comparing it to the libraries listed below.
- Quickly build services to integrate your local data and AI models. ☆456 · Updated 2 years ago
- Provide an OpenAI-style API for ChatGLM-6B and a Chinese embeddings model ☆512 · Updated 2 years ago
- IDP is an open-source AI IDE for data scientists and big data engineers. ☆216 · Updated 10 months ago
- Customize APIs from GLM, ChatGLM ☆71 · Updated 10 months ago
- Quickly build a document question-answering bot based on ChatGLM ☆89 · Updated 2 years ago
- OrionStar-Yi-34B-Chat is an open-source Chinese/English chat model fine-tuned by OrionStar (猎户星空) from the open-source Yi-34B model on 150K+ high-quality corpus entries. ☆262 · Updated last year
- Humanable Chat Generative-model Fine-tuning | LLM fine-tuning ☆206 · Updated 2 years ago
- A conversational document question-answering solution for the post-ChatGPT era ☆110 · Updated 2 years ago
- Play LLaMA2 (official / Chinese version / INT4 / llama2.cpp) together! ONLY 3 STEPS! (non-GPU / 5GB vRAM / 8–14GB vRAM) ☆541 · Updated 2 years ago
- Phi3 Chinese post-training model repository ☆324 · Updated last year
- Repo for our new knowledge-based Chinese medical large language model, BianQue (扁鹊, Pien-Chueh). Coming soon. ☆108 · Updated 2 years ago
- The first Chinese version of Llama2 13B (base model + Chinese dialogue SFT, enabling fluent multi-turn natural-language human-machine interaction) ☆91 · Updated 2 years ago
- A simplified demo similar to ChatPDF ☆193 · Updated 2 years ago
- Implement OpenAI APIs and plugin-enabled ChatGPT with open-source LLMs and other models. ☆121 · Updated last year
- ☆348 · Updated last year
- A collection of LangChain tools, flow-design components, services, agents, and related learning documentation (agent, service, tutorials, flow-design) ☆135 · Updated last year
- Salesforce CodeGen with a web server ☆193 · Updated 11 months ago
- A web UI designed for the l15y/wenda (Wenda) platform ☆443 · Updated 2 years ago
- Self-hosted ChatGLM-6B API built with FastAPI ☆79 · Updated 2 years ago
- XVERSE-13B: A multilingual large language model developed by XVERSE Technology Inc. ☆645 · Updated last year
- Something like visual-chatgpt; an open-source version of Wenxin Yiyan (文心一言) ☆1,201 · Updated last year
- The "Wudao" (悟道) model ☆131 · Updated 4 years ago
- ☆70 · Updated last year
- visual-chatgpt with Chinese-language support ☆286 · Updated 2 years ago
- ☆137 · Updated 2 years ago
- A LangChain wrapper for Baidu WenxinWorkshop ☆69 · Updated 2 years ago
- Llama3-Chinese is a large model built on the Meta-Llama-3-8B base, trained with the DORA + LORA+ method on 500K high-quality Chinese multi-turn SFT samples, 100K English multi-turn SFT samples, and 2,000 single-turn self-cognition samples. ☆295 · Updated last year
- This repo contains the data preparation, tokenization, training and inference code for BLOOMChat. BLOOMChat is a 176 billion parameter mu… ☆585 · Updated 2 years ago
- LocalAGI: Locally run AGI powered by LLaMA, ChatGLM, and more. ☆81 · Updated 2 years ago
- A ChatGLM fine-tuned on a corpus from Empresses in the Palace (甄嬛, Zhen Huan) ☆86 · Updated 2 years ago