chraac / llama.cpp — LLM inference in C/C++. ☆21, updated this week.
Alternatives and similar repositories for llama.cpp — libraries that users interested in llama.cpp often compare it against:
- Inference for RWKV v5, v6, and (WIP) v7 with the Qualcomm AI Engine Direct SDK. ☆49, updated last week.
- A text-to-image generation project based on the open-source Stable Diffusion V1.5 model, producing models that can run on mobile-phone CPUs and NPUs, together with a matching model-runtime framework. ☆135, updated 10 months ago.
- The Qualcomm® AI Hub apps are a collection of state-of-the-art machine learning models optimized for performance (latency, memory, etc.) a… ☆113, updated last week.
- A converter for llama2.c legacy models to ncnn models. ☆86, updated last year.
- llm-export can export LLM models to ONNX. ☆257, updated last week.