ling0322 / libllm
Efficient inference of large language models.
☆151 · Updated this week
Alternatives and similar repositories for libllm
Users interested in libllm are comparing it to the libraries listed below:
- An editor for ncnn and pnnx formats ☆137 · Updated 11 months ago
- ☆125 · Updated last year
- Detect CPU features with a single file ☆420 · Updated last week
- A converter from llama2.c legacy models to ncnn models ☆81 · Updated last year
- Inference of TinyLlama models on ncnn ☆24 · Updated 2 years ago
- Inference of RWKV v5, v6, and v7 with the Qualcomm AI Engine Direct SDK ☆81 · Updated this week
- GPT2 ⚡ NCNN ⚡ Chinese chat ⚡ x86 ⚡ Android ☆81 · Updated 3 years ago
- Tiny C++ LLM inference implementation from scratch ☆65 · Updated last week
- ☆33 · Updated last year
- An LLM deployment project based on ONNX ☆44 · Updated 11 months ago
- A simple general-purpose programming language ☆99 · Updated 2 weeks ago
- Inference of RWKV on NCNN ☆49 · Updated last year
- A handbook on building your own AI inference engine: everything you need to know, starting from zero ☆270 · Updated 3 years ago
- Stable Diffusion using MNN ☆66 · Updated last year
- ☆84 · Updated 2 years ago
- MegCC is an ultra-lightweight, efficient, and easily portable deep learning model compiler ☆487 · Updated 10 months ago
- Make a minimal OpenCV runnable anywhere (WIP) ☆84 · Updated 2 years ago
- ☆32 · Updated last year
- A layered, decoupled deep learning inference engine ☆75 · Updated 6 months ago
- An easy way to run, test, benchmark, and tune OpenCL kernel files ☆23 · Updated 2 years ago
- Header-only safetensors loader and saver in C++ ☆68 · Updated 4 months ago
- ☆41 · Updated 2 years ago
- Qwen2 and Llama3 C++ implementation ☆47 · Updated last year
- Benchmark your NCNN models on the 3DS (or crash) ☆10 · Updated last year
- NeRF in NCNN with C++ & Vulkan ☆68 · Updated 2 years ago
- CPM.cu is a lightweight, high-performance CUDA implementation for LLMs, optimized for end-device inference and featuring cutting-edge tec… ☆190 · Updated this week
- DragGAN in NCNN with C++ ☆52 · Updated last year
- The Snapdragon Neural Processing Engine (SNPE) is a Qualcomm Snapdragon software accelerate… ☆34 · Updated 3 years ago
- ☆27 · Updated 2 years ago
- A toolkit to help optimize large ONNX models ☆159 · Updated last year
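One of the listed projects is a header-only safetensors loader and saver. The safetensors layout itself is simple: an 8-byte little-endian header length, followed by that many bytes of JSON mapping tensor names to dtype, shape, and data offsets, followed by the raw tensor bytes. A minimal Python sketch of reading that header (the tensor name `w` and its metadata below are made up for illustration):

```python
import io
import json
import struct

def read_safetensors_header(f):
    """Parse the JSON header of a .safetensors stream.

    Layout: 8-byte little-endian unsigned header length, then that many
    bytes of JSON describing each tensor's dtype, shape, and data offsets.
    """
    (header_len,) = struct.unpack("<Q", f.read(8))
    return json.loads(f.read(header_len))

# Build a tiny in-memory file for demonstration (hypothetical tensor "w":
# a 2x2 float32 tensor occupying bytes 0..16 of the data section).
header = {"w": {"dtype": "F32", "shape": [2, 2], "data_offsets": [0, 16]}}
payload = json.dumps(header).encode("utf-8")
blob = struct.pack("<Q", len(payload)) + payload + b"\x00" * 16

meta = read_safetensors_header(io.BytesIO(blob))
print(meta["w"]["shape"])  # → [2, 2]
```

Because the header is plain JSON behind a fixed-size length prefix, a loader can inspect tensor names and shapes without reading any tensor data, which is what makes header-only C++ implementations like the one above practical.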