Efficient inference of large language models.
☆149 — updated Sep 28, 2025
Alternatives and similar repositories for libllm
Users interested in libllm are comparing it to the repositories listed below.
- UIE (Universal Information Extraction) inference with ncnn — ☆15, updated Sep 22, 2024
- ☆33, updated Jul 23, 2024
- Detect CPU features with a single file — ☆451, updated this week
- ☆16, updated Mar 24, 2025
- A layered, decoupled deep learning inference engine — ☆79, updated Feb 17, 2025
- Benchmark your NCNN models on the 3DS (or crash) — ☆10, updated Apr 15, 2024
- A converter for llama2.c legacy models to ncnn models — ☆79, updated Dec 17, 2023
- Infer RWKV with ncnn — ☆49, updated Sep 3, 2024
- ☆23, updated Jan 3, 2024
- Lane-line detection based on the NCNN framework (C/C++) — ☆24, updated Apr 21, 2025
- ppstructure deployed with ncnn — ☆35, updated Jul 16, 2024
- Self-trained large language models based on Meta LLaMA — ☆29, updated Aug 11, 2023
- Whisper in TensorRT-LLM — ☆17, updated Sep 21, 2023
- Linux BSP apps & samples for axpi (ax620a) — ☆36, updated Jun 21, 2023
- An implementation of memcpy for amd64 with clang/gcc — ☆15, updated Feb 7, 2022
- An easy way to run, test, benchmark, and tune OpenCL kernel files — ☆24, updated Aug 25, 2023
- ☆125, updated Dec 15, 2023
- Inferflow is an efficient and highly configurable inference engine for large language models (LLMs).