TylunasLi / fastllm
A pure C++ cross-platform LLM acceleration library, callable from Python; supports the chatglm-6B, llama, baichuan, and moss base models on x86 / ARM.
☆13 · Updated this week
Alternatives and similar repositories for fastllm
Users interested in fastllm are comparing it to the repositories listed below.
- Collection of high-quality roleplay conversation data ☆15 · Updated 9 months ago
- Removes the need for manual flow orchestration and intent recognition in knowledge-base search ☆15 · Updated 2 months ago
- ☆15 · Updated last year
- An introduction to using Docker and Docker Compose. ☆20 · Updated 11 months ago
- Deploy your own OpenAI API 🤩, built on Flask and transformers (uses the Baichuan2-13B-Chat-4bits model, runs on a single Tesla T4 GPU); implements the OpenAI Chat, Models, and Completions endpoints, including streaming respon… ☆95 · Updated last year
- Accelerate embedding vector generation using ONNX models ☆18 · Updated last year
- A playground for any API server compatible with the OpenAI API ☆24 · Updated last year
- ☆58 · Updated 10 months ago
- gpt_server is an open-source framework for production-grade deployment of LLMs, embedding, reranker, ASR, and TTS models. ☆206 · Updated last week
- Qwen-WisdomVast is a large model trained on 1 million high-quality Chinese multi-turn SFT data, 200,000 English multi-turn SFT data, and … ☆18 · Updated last year
- Shared datasets: prompt data and pretraining data ☆36 · Updated last year
- RWKV fine-tuning ☆37 · Updated last year
- A fluent, scalable, and easy-to-use LLM data processing framework. ☆25 · Updated last month
- ✅ Works on a 4 GB GPU | A simple implementation of single-machine ChatGLM inference across multiple compute devices (GPU and CPU) ☆34 · Updated 2 years ago
- Alpaca Chinese Dataset: a Chinese instruction fine-tuning dataset ☆213 · Updated 10 months ago
- Uses langchain for task planning and builds conversational scene resources for subtasks; an MCTS task executor lets each subtask draw on in-context resources and self-reflective exploration to reach its best answer to the problem. This approach relies on the model's alignment preferences; for each preference, an engineering framework is designed to carry out a sampling strategy over the model's self-assigned rewards for different answers. ☆29 · Updated last month
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆137 · Updated 8 months ago
- ☆157 · Updated last year
- LLM-related resources, including code and docs ☆13 · Updated 6 months ago
- SearchGPT: Building a quick conversation-based search engine with LLMs. ☆47 · Updated 7 months ago
- An archive of NLP project records ☆58 · Updated 4 months ago
- Fine-tuning for Qwen models ☆103 · Updated 5 months ago
- Raise an AI cat with its own dedicated memory! ☆10 · Updated 10 months ago
- Intelligent voice intercom based on LLM-generated content ☆10 · Updated 9 months ago
- Uses an LLM plus a sensitive-word lexicon to automatically detect sensitive content. ☆127 · Updated 2 years ago
- Train a small chatglm model from scratch ☆48 · Updated last year
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆132 · Updated last year
- 360zhinao ☆291 · Updated 3 months ago
- A multimodal image-text dialogue LLM built on Blip2RWKV + QFormer; using a Two-Step Cognitive Psychology Prompt method, a model of only 3B parameters can exhibit human-like causal chains of thought. Benchmarked against image-text dialogue LLMs such as MiniGPT-4 and ImageBind, aiming to use less compute and fewer resources to ach… ☆39 · Updated 2 years ago
- (1) A rotary positional embedding encoder with elastic-interval normalization, plus peft LoRA quantized training, improving performance at tens of thousands of tokens. (2) Evidence-theory interpretable learning to strengthen the model's complex logical reasoning. (3) Compatible with the alpaca data format. ☆45 · Updated 2 years ago