clue-ai / ChatYuan-7B
☆13 · Updated last year
Alternatives and similar repositories for ChatYuan-7B:
Users interested in ChatYuan-7B are comparing it to the libraries listed below.
- A more efficient GLM implementation! ☆55 · Updated 2 years ago
- 骆驼QA (Luotuo QA): a Chinese large language model for reading comprehension. ☆75 · Updated last year
- The world's first Chinese-optimized version of StableVicuna. ☆64 · Updated last year
- LLaMA inference for tencentpretrain. ☆98 · Updated last year
- Use ChatGLM to perform text embedding (see the embedding sketch after this list). ☆45 · Updated last year
- A code implementation of Dynamic NTK-ALiBi for Baichuan: inference over longer contexts without any fine-tuning. ☆47 · Updated last year
- Implements a cross-model scheme combining multi-LoRA weight ensembling and switching with zero-finetune enhancement (LLM-Base + LLM-X + Alpaca); initially, LLM-Base is the ChatGLM-6B base model and LLM-X is a LLaMA enhancement model. The scheme is simple and efficient, aiming to let such language models be widely deployed at low energy cost, and… ☆117 · Updated last year
- Simple implementation of using LoRA from the peft library to fine-tune chatglm-6b (see the LoRA sketch after this list). ☆85 · Updated last year
- MOSS chat fine-tuning. ☆50 · Updated 10 months ago
- chatglm_rlhf_finetuning ☆28 · Updated last year
- chatglm-6b fine-tuning / LoRA / PPO / inference; training samples are auto-generated integer and decimal arithmetic (addition, subtraction, multiplication, division); runs on GPU or CPU. ☆164 · Updated last year
- Demonstrates the remarkable effect of vLLM on Chinese large language models (see the vLLM sketch after this list). ☆31 · Updated last year
- (NBCE) Naive Bayes-based Context Extension on ChatGLM-6B (see the NBCE sketch after this list). ☆14 · Updated last year
- Another ChatGLM2 implementation for GPTQ quantization. ☆54 · Updated last year
- ChatGLM-6B fine-tuning. ☆135 · Updated last year
- A zero-shot learning evaluation benchmark, Chinese edition. ☆55 · Updated 3 years ago
- Silk Road will be the dataset zoo for Luotuo (骆驼). Luotuo is an open-source Chinese-LLM project founded by 陈启源 @ 华中师范大学, 李鲁鲁 @ 商汤科技, and 冷子… ☆38 · Updated last year
- ☆34 · Updated 3 years ago
- ☆43 · Updated last year
- NLU & NLG (zero-shot) based on the mengzi-t5-base-mt pretrained model. ☆75 · Updated 2 years ago
- Complete training code for an open-source, high-performance LLaMA model, covering the full pipeline from pre-training to RLHF. ☆65 · Updated last year
- A lightweight proxy solution for the Hugging Face Hub. ☆46 · Updated last year
- ColossalChat: a project implementing an LLM with RLHF, powered by Colossal-AI. ☆63 · Updated 10 months ago
- A lightweight local website for displaying the performance of different chat models. ☆85 · Updated last year
- A pure C++ cross-platform LLM acceleration library with Python bindings; supports Baichuan, GLM, LLaMA, and MOSS base models; runs ChatGLM-6B-class models smoothly on mobile and reaches 10,000+ tokens/s on a single GPU. ☆45 · Updated last year
- ChatGLM-6B-Slim: ChatGLM-6B with 20K image tokens pruned away; identical performance with a smaller GPU-memory footprint. ☆126 · Updated last year
- Aims to provide an intuitive, concrete, and standardized evaluation of today's mainstream LLMs. ☆93 · Updated last year
- Code for "Scaling Laws of RoPE-based Extrapolation". ☆70 · Updated last year
- Deep learning. ☆150 · Updated this week
- Gaokao Benchmark for AI. ☆108 · Updated 2 years ago
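
For the ChatGLM text-embedding entry above, a minimal sketch of the general approach: run text through the model and mean-pool the last hidden states. The checkpoint name and the `output_hidden_states` call are assumptions about a Hugging Face-style interface; the linked repository's actual code may differ.

```python
# Minimal ChatGLM text-embedding sketch (illustrative, not the linked repo's code).
import torch
from transformers import AutoTokenizer, AutoModel

MODEL = "THUDM/chatglm-6b"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(MODEL, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL, trust_remote_code=True).half().eval()

@torch.no_grad()
def embed(text: str) -> torch.Tensor:
    inputs = tokenizer(text, return_tensors="pt")
    # output_hidden_states is standard in transformers; ChatGLM's custom
    # modeling code may expose hidden states slightly differently.
    out = model(**inputs, output_hidden_states=True)
    last = out.hidden_states[-1].squeeze()  # (seq_len, hidden_dim)
    return last.float().mean(dim=0)         # mean-pool over tokens
```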
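For the peft/LoRA entry above, a minimal configuration sketch. The hyperparameters (`r`, `lora_alpha`, dropout) are illustrative choices, not the repository's settings; `query_key_value` is the name of ChatGLM's fused attention projection.

```python
# Minimal LoRA fine-tuning setup with peft (hyperparameters are illustrative).
from transformers import AutoModel
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                 # low-rank adapter dimension
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["query_key_value"],  # ChatGLM's fused QKV projection
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()       # only adapter weights are trainable
```

The base weights stay frozen and only the low-rank adapters (a fraction of a percent of the parameters) receive gradients, which is what makes single-GPU fine-tuning of a 6B-parameter model feasible.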
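For the vLLM demo entry above, the standard offline-inference pattern; the model name here is a placeholder for whichever Chinese LLM is being served.

```python
# Minimal vLLM offline-inference sketch (model name is a placeholder).
from vllm import LLM, SamplingParams

llm = LLM(model="baichuan-inc/Baichuan-7B", trust_remote_code=True)
params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=256)

prompts = ["请用一句话介绍大语言模型。"]  # "Introduce LLMs in one sentence."
for output in llm.generate(prompts, params):
    print(output.outputs[0].text)
```

vLLM's continuous batching and PagedAttention memory management are what produce the throughput gains the entry refers to.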
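For the NBCE entry above, a sketch of the core naive-Bayes pooling step only; this shows how per-context next-token logits are combined, while the repository wraps this in a full decoding loop.

```python
# Core NBCE pooling step (sketch of the naive-Bayes combination only).
import torch
import torch.nn.functional as F

def nbce_combine(context_logits: torch.Tensor,
                 uncond_logits: torch.Tensor) -> torch.Tensor:
    """context_logits: (n, vocab) next-token logits, one row per context chunk.
    uncond_logits: (vocab,) next-token logits with no context (the prior)."""
    logp = F.log_softmax(context_logits, dim=-1)   # log p(token | S_i)
    logp0 = F.log_softmax(uncond_logits, dim=-1)   # log p(token)
    n = context_logits.shape[0]
    # Naive Bayes: log p(token | S_1..S_n) = sum_i log p(token | S_i)
    #                                        - (n - 1) * log p(token) + const
    return logp.sum(dim=0) - (n - 1) * logp0
```

At each decoding step the model is run once per context chunk plus once without context; sampling from the pooled logits lets a model with a short window draw on a much longer document split into chunks.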