ling0322 / libllm
Efficient inference of large language models.
☆144 · Updated 2 months ago
Alternatives and similar repositories for libllm:
Users interested in libllm are comparing it to the repositories listed below.
- ☆124 · Updated last year
- An editor for ncnn and pnnx model formats ☆130 · Updated 4 months ago
- Detect CPU features with a single file ☆355 · Updated last month
- A converter for llama2.c legacy models to ncnn models. ☆87 · Updated last year
- Inference of TinyLlama models on ncnn ☆24 · Updated last year
- Explore LLM model deployment based on AXera's AI chips ☆80 · Updated last week
- Tiny C++11 GPT-2 inference implementation from scratch ☆55 · Updated last month
- Stable Diffusion using MNN ☆65 · Updated last year
- Benchmark your NCNN models on the 3DS (or crash) ☆9 · Updated 10 months ago
- An LLM deployment project based on ONNX. ☆32 · Updated 4 months ago
- A programming language for AI infrastructure ☆87 · Updated 2 months ago
- A handbook on building your own AI inference engine: everything you need to know, starting from zero ☆259 · Updated 2 years ago
- Make a minimal OpenCV runnable anywhere (WIP) ☆83 · Updated 2 years ago
- Infer RWKV on NCNN ☆48 · Updated 5 months ago
- ☆32 · Updated 6 months ago
- A demo of how to write a high-performance convolution that runs on Apple silicon ☆52 · Updated 3 years ago
- NeRF in NCNN with C++ & Vulkan ☆67 · Updated last year
- A layered, decoupled deep learning inference engine ☆70 · Updated this week
- mperf is an operator performance tuning toolbox for mobile/embedded platforms ☆178 · Updated last year
- MegCC is a deep learning model compiler with an ultra-lightweight runtime that is efficient and easy to port ☆474 · Updated 3 months ago
- GPT2 ⚡ NCNN ⚡ Chinese chat ⚡ x86 ⚡ Android ☆79 · Updated 2 years ago
- A simple forward-inference framework extracted from MNN (for study!) ☆20 · Updated 4 years ago
- OneFlow->ONNX ☆42 · Updated last year
- An example of Segment Anything inference with ncnn ☆121 · Updated last year
- An easy way to run, test, benchmark and tune OpenCL kernel files ☆23 · Updated last year
- Inference of RWKV v5, v6, and (WIP) v7 with the Qualcomm AI Engine Direct SDK ☆52 · Updated this week
- llama2.c-zh, a small language model supporting Chinese-language scenarios ☆145 · Updated 11 months ago
- ppstructure deployed with ncnn ☆27 · Updated 7 months ago
- Study notes on ggml, a machine learning inference framework ☆14 · Updated 10 months ago
- DragGAN in NCNN with C++ ☆50 · Updated last year