femnn / RecurrentGPT-zh
☆27 · Updated 2 years ago
Alternatives and similar repositories for RecurrentGPT-zh
Users interested in RecurrentGPT-zh are comparing it to the repositories listed below.
- ☆61 · Updated 2 years ago
- ChatGLM-6B-Slim: ChatGLM-6B with the 20K image tokens pruned away; identical performance with a smaller GPU memory footprint. ☆126 · Updated 2 years ago
- The world's first Chinese-optimized version of StableVicuna. ☆64 · Updated last year
- CodeGPT: A Code-Related Dialogue Dataset Generated by GPT and for GPT ☆113 · Updated 2 years ago
- An LLM with the style of Lu Xun (鲁迅) ☆85 · Updated 2 years ago
- Repo for our new knowledge-based Chinese Medical Large Language Model, BianQue (扁鹊, Pien-Chueh). Coming soon. ☆108 · Updated 2 years ago
- Claude in Slack API ☆190 · Updated 2 years ago
- SuperCLUE 琅琊榜 (LangYaBang): an anonymous arena-style evaluation benchmark for general-purpose Chinese LLMs ☆144 · Updated last year
- A simplified demo similar to ChatPDF ☆191 · Updated 2 years ago
- MultilingualShareGPT, a free multilingual corpus for LLM training ☆72 · Updated 2 years ago
- SuperCLUE-Role: a native Chinese role-playing evaluation benchmark ☆33 · Updated last year
- The “悟道” (WuDao) model ☆122 · Updated 3 years ago
- ChatGLM fine-tuned on a Zhen Huan (甄嬛) dialogue corpus ☆85 · Updated 2 years ago
- Gaokao Benchmark for AI ☆108 · Updated 2 years ago
- An Instruction-tuned Large Language Model for E-commerce ☆246 · Updated last year
- Implements a cross-model scheme combining multi-LoRA weight ensembling and switching with zero-finetune enhancement: LLM-Base + LLM-X + Alpaca. Initially, LLM-Base is the ChatGLM-6B base model and LLM-X is a LLaMA enhancement model. The scheme is simple and efficient, aiming to let such language models be widely deployed at low energy cost, and … (a general sketch of multi-LoRA adapter switching follows this list) ☆115 · Updated last year
- Customize APIs from GLM, ChatGLM ☆68 · Updated 4 months ago
- The complete training code of the open-source high-performance Llama model, including the full process from pre-training to RLHF. ☆69 · Updated 2 years ago
- ☆31 · Updated 2 years ago
- A simulation of the world using GPTs. (deprecated) ☆157 · Updated last year
- A lightweight local website for displaying the performance of different chat models. ☆87 · Updated last year
- ☆91 · Updated last year
- 骆驼QA (Luotuo QA), a Chinese large language model for reading comprehension. ☆74 · Updated 2 years ago
- Chinese Couplets Dataset without vulgar words; a couplet dataset free of sensitive content. ☆73 · Updated 5 years ago
- ChatGLM-6B fine-tuning / LoRA / PPO / inference; samples are auto-generated integer and decimal addition, subtraction, multiplication, and division problems; runs on GPU or CPU ☆164 · Updated last year
- MOSS chat fine-tuning ☆50 · Updated last year
- ☆34 · Updated 3 years ago
- Stable Diffusion web UI ☆112 · Updated 2 years ago
- Customized fine-tuning on top of open-source Chinese LLMs, giving you your own dedicated language model. ☆47 · Updated 2 years ago
- ☆150 · Updated last year
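For the multi-LoRA entry referenced above, here is a minimal sketch of the general idea of serving one frozen base model and switching between several LoRA adapters at runtime, written against the Hugging Face PEFT API. It is not the linked repository's implementation; the base-model path, adapter directories, and adapter names are placeholder assumptions.

```python
# Minimal sketch (assumptions, not the linked repo's code): keep one frozen
# base model in memory and hot-swap several LoRA adapters with Hugging Face PEFT.
# The base-model path and adapter directories below are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_MODEL = "path/to/base-causal-lm"  # placeholder base model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL, trust_remote_code=True)
base = AutoModelForCausalLM.from_pretrained(BASE_MODEL, trust_remote_code=True)

# Attach a first LoRA adapter, then register a second one on the same base model.
model = PeftModel.from_pretrained(base, "path/to/lora-adapter-A", adapter_name="task_a")
model.load_adapter("path/to/lora-adapter-B", adapter_name="task_b")

def generate_with(adapter_name: str, prompt: str) -> str:
    """Activate one adapter and generate; the base weights stay frozen throughout."""
    model.set_adapter(adapter_name)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

print(generate_with("task_a", "你好"))
print(generate_with("task_b", "你好"))
```

Because each adapter is only a small fraction of the base model's size and the base weights are loaded once, switching tasks this way avoids holding multiple full models in memory, which is the appeal of the low-cost deployment goal described in that entry.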