singularity-s0 / MOSS_frontend
Frontend for the MOSS chatbot.
☆48 · Updated last year
Alternatives and similar repositories for MOSS_frontend
Users interested in MOSS_frontend are comparing it to the libraries listed below.
- Backend for the fastnlp MOSS project ☆59 · Updated last year
- Moss Vortex is a lightweight and high-performance deployment and inference backend engineered specifically for MOSS 003, providing a weal… ☆37 · Updated 2 years ago
- Kanchil (鼷鹿) is named after the world's smallest even-toed ungulate; this open-source project explores whether small models (under 6B parameters) can also be aligned with human preferences. ☆113 · Updated 2 years ago
- XVERSE-MoE-A4.2B: A multilingual large language model developed by XVERSE Technology Inc. ☆39 · Updated last year
- A multimodal image-text dialogue model built from Blip2RWKV plus a Q-Former, using a Two-Step Cognitive Psychology Prompt method; with only 3B parameters it can exhibit human-like causal chains of thought. Positioned against image-text dialogue LLMs such as MiniGPT-4 and ImageBind, it aims to do more with less compute and resourc… ☆38 · Updated 2 years ago
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆139 · Updated last year
- MOSS 003 WebSearchTool: A simple but reliable implementation ☆45 · Updated 2 years ago
- RWKV fine-tuning ☆36 · Updated last year
- A cross-model technique combining multi-LoRA weight ensembling and switching with Zero-Finetune (zero-fine-tuning) enhancement: LLM-Base + LLM-X + Alpaca. Initially, LLM-Base is the ChatGLM-6B base model and LLM-X is a LLaMA-based enhancement model. The scheme is simple and efficient, aiming to let such language models be deployed widely and at low energy cost, and … ☆116 · Updated last year
- ✅ Usable on a 4 GB GPU | A simple way to run ChatGLM inference across multiple compute devices (GPU, CPU) on a single machine ☆34 · Updated 2 years ago
- ChatGLM-6B-Slim: ChatGLM-6B with its 20K image tokens pruned away, delivering identical performance with a smaller GPU-memory footprint. ☆126 · Updated 2 years ago
- A more efficient GLM implementation! ☆55 · Updated 2 years ago
- An open-source LLM based on an MoE (Mixture-of-Experts) structure. ☆58 · Updated last year
- The first Chinese-optimized version of StableVicuna. ☆64 · Updated 2 years ago
- Zero-training LLM parameter tuning ☆31 · Updated last year
- SUS-Chat: Instruction tuning done right ☆48 · Updated last year
- Repo for our new knowledge-based Chinese medical large language model, BianQue (扁鹊, Pien-Chueh). Coming soon. ☆108 · Updated 2 years ago
- The official codes for "Aurora: Activating chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning" ☆263 · Updated last year
- The first Chinese Llama 2 13B model (base + Chinese dialogue SFT, enabling fluent multi-turn human-machine natural-language interaction) ☆90 · Updated last year
- CamelBell (驼铃) is a Chinese-language tuning project based on LoRA. CamelBell belongs to Project Luotuo (骆驼), an open-sourced Chinese-… ☆173 · Updated last year
- The "WuDao" (悟道) model ☆122 · Updated 3 years ago
- Shared data: prompt data and pretraining data ☆36 · Updated last year
- A light proxy solution for the HuggingFace hub. ☆47 · Updated last year
- GLM Series Edge Models ☆144 · Updated last month
- Deep learning ☆148 · Updated 2 months ago
- Another ChatGLM2 implementation for GPTQ quantization ☆54 · Updated last year
- ChatGLM instruction fine-tuning based on Chinese legal knowledge ☆44 · Updated 2 years ago
- The Silk Magic Book will record the Magic Prompts for some very large LLMs. The Silk Magic Book belongs to the project Luotuo (骆驼), which c… ☆56 · Updated 2 years ago
- An open-source multimodal large language model based on baichuan-7b ☆73 · Updated last year
- Gaokao Benchmark for AI ☆108 · Updated 3 years ago