Liyulingyue / ChatGLM-6B-Prompt
A prompt set for ChatGLM-6B
☆14 · Updated last year
Alternatives and similar repositories for ChatGLM-6B-Prompt
Users interested in ChatGLM-6B-Prompt are comparing it to the libraries listed below.
- ggml implementation of the baichuan13b model (adapted from llama.cpp) ☆54 · Updated last year
- A Paddle implementation of Meta's LLaMA. ☆45 · Updated 2 years ago
- A multimodal image-text dialogue LLM built on Blip2RWKV+QFormer; using a Two-Step Cognitive Psychology Prompt method, a model with only 3B parameters exhibits human-like causal chains of thought. Benchmarked against image-text dialogue LLMs such as MiniGPT-4 and ImageBind, aiming to achieve this with less compute and fewer resources… ☆38 · Updated last year
- Run RWKV V4 ONNX on an Android CPU ☆68 · Updated last year
- zero: zero-training LLM parameter tuning ☆31 · Updated last year
- share data, prompt data, pretraining data ☆36 · Updated last year
- Run chatglm3-6b on BM1684X ☆39 · Updated last year
- A more efficient GLM implementation! ☆55 · Updated 2 years ago
- The world's first Chinese-optimized version of StableVicuna. ☆64 · Updated last year
- Another ChatGLM2 implementation for GPTQ quantization ☆54 · Updated last year
- ChatGLM-6B-Slim: ChatGLM-6B with 20K image tokens pruned away, delivering identical performance with a smaller GPU memory footprint. ☆126 · Updated 2 years ago
- GPT2⚡NCNN⚡Chinese dialogue⚡x86⚡Android ☆79 · Updated 3 years ago
- ☆11 · Updated 2 years ago
- Finetune Llama 3, Mistral & Gemma LLMs 2-5x faster with 80% less memory ☆27 · Updated last year
- GRAIN: Gradient-based Intra-attention Pruning on Pre-trained Language Models ☆19 · Updated last year
- Kanchil (鼷鹿, the chevrotain) is the world's smallest even-toed ungulate; this open-source project explores whether small models (under 6B parameters) can also be aligned with human preferences. ☆113 · Updated 2 years ago
- rwkv finetuning ☆36 · Updated last year
- Run ChatGLM2-6B on BM1684X ☆49 · Updated last year
- A pure C++ cross-platform LLM acceleration library, callable from Python, supporting baichuan, glm, llama, and moss base models; runs chatglm-6B-class models smoothly on mobile and reaches 10000+ tokens/s on a single GPU. ☆45 · Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆139 · Updated last year
- A survey of large language model training and serving ☆37 · Updated last year
- Reborn as an AI worker. In my previous life I was a nobody, coming and going in a hurry, never knowing where I would be born. But fate gave me a rare opportunity: to be reborn as an AI worker. ☆48 · Updated 2 years ago
- llama inference for tencentpretrain ☆98 · Updated 2 years ago
- GLM Series Edge Models ☆142 · Updated 2 weeks ago
- An open-source LLM based on an MoE structure. ☆58 · Updated 11 months ago
- Large-scale exact string matching tool ☆17 · Updated 3 months ago
- XVERSE-MoE-A4.2B: A multilingual large language model developed by XVERSE Technology Inc. ☆39 · Updated last year
- Aims to train a Chinese mini LLM from scratch that can hold basic conversations, with the model size determined by the hardware at hand ☆60 · Updated 10 months ago
- A dataset of 1 million Baidu QA pairs ☆47 · Updated last year
- A lightweight proxy solution for the HuggingFace hub. ☆47 · Updated last year