A lightweight, portable pure C99 onnx inference engine for embedded devices with hardware acceleration support.
☆647 · Aug 5, 2025 · Updated 6 months ago
Alternatives and similar repositories for libonnx
Users that are interested in libonnx are comparing it to the libraries listed below
- Pure C ONNX runtime with zero dependencies for embedded devices ☆216 · Oct 29, 2023 · Updated 2 years ago
- The extensible bootloader for embedded systems with an application engine; write once, run everywhere. ☆899 · Apr 12, 2025 · Updated 10 months ago
- A higher-level Neural Network library for microcontrollers. ☆1,137 · Apr 8, 2024 · Updated last year
- TinyMaix is a tiny inference library for microcontrollers (TinyML). ☆1,035 · Feb 5, 2025 · Updated last year
- Versaloon Software Framework -- a tiny preemptive-capable event-driven incremental software framework for embedded systems ☆327 · Feb 17, 2026 · Updated 2 weeks ago
- Simplify your onnx model ☆4,297 · Updated this week
- ☆25 · Sep 19, 2025 · Updated 5 months ago
- Buildroot Package for F1C100s/200s ☆198 · Jul 18, 2023 · Updated 2 years ago
- MegCC is a deep learning model compiler with an ultra-lightweight runtime that is efficient and easy to port ☆486 · Oct 23, 2024 · Updated last year
- Vendor-independent TinyML deep learning library, compiler, and inference framework for microcomputers and microcontrollers ☆602 · Jul 22, 2025 · Updated 7 months ago
- Tengine is a lightweight, high-performance, modular inference engine for embedded devices ☆4,506 · Mar 6, 2025 · Updated 11 months ago
- Open Neural Network Exchange model parser in C ☆16 · Jul 26, 2025 · Updated 7 months ago
- Reverse engineering the V831 NPU ☆95 · Jun 9, 2021 · Updated 4 years ago
- Benchmark for embedded-AI deep learning inference engines, such as NCNN / TNN / MNN / TensorFlow Lite, etc. ☆202 · Feb 18, 2021 · Updated 5 years ago
- New operators for the ReferenceEvaluator, new kernels for onnxruntime, CPU, CUDA ☆35 · Feb 13, 2026 · Updated 2 weeks ago
- Arm NN ML Software. ☆1,298 · Jan 23, 2026 · Updated last month
- Tengine Pipe (管子) is a helper tool for quickly producing demos ☆12 · Jul 15, 2021 · Updated 4 years ago
- Building an easy-to-use USB driver for SoCs ☆12 · Jul 3, 2020 · Updated 5 years ago
- A flexible and efficient deep neural network (DNN) compiler that generates high-performance executables from a DNN model description. ☆1,006 · Sep 19, 2024 · Updated last year
- TNN: developed by Tencent Youtu Lab and Guangying Lab, a uniform deep learning inference framework for mobile, desktop, and server. TNN is … ☆4,620 · May 9, 2025 · Updated 9 months ago
- Python TFLite scripts for detecting objects of any class in an image without knowing their label. ☆53 · Sep 18, 2021 · Updated 4 years ago
- F1C100s with Keil RTX4 + emWin5 ☆152 · May 6, 2024 · Updated last year
- A lightweight, portable pure C99 NES emulator library. ☆43 · Oct 21, 2024 · Updated last year
- lightcore source tree ☆24 · May 3, 2023 · Updated 2 years ago
- Special presentation demo and slides from Intel IoT Planet 2021 DeepLearning Day https://www.intel.co… ☆21 · Mar 6, 2022 · Updated 3 years ago
- Tiny FEL tools for Allwinner SoCs; supports the RISC-V D1 chip ☆293 · Dec 19, 2025 · Updated 2 months ago
- ☆225 · Nov 3, 2025 · Updated 4 months ago
- CMake build system (framework) with Kconfig support for C/C++ projects ☆191 · Mar 31, 2024 · Updated last year
- Bolt is a deep learning library with high performance and heterogeneous flexibility. ☆956 · Apr 11, 2025 · Updated 10 months ago
- ☆21 · Mar 18, 2021 · Updated 4 years ago
- ncnn is a high-performance neural network inference framework optimized for the mobile platform ☆22,819 · Feb 20, 2026 · Updated last week
- High-efficiency floating-point neural network inference operators for mobile, server, and Web ☆2,263 · Updated this week
- ☆25 · Aug 27, 2021 · Updated 4 years ago
- ☆11 · Nov 18, 2021 · Updated 4 years ago
- A library for high-performance deep learning inference on NVIDIA GPUs. ☆555 · Jan 29, 2022 · Updated 4 years ago
- Inference TinyLlama models on ncnn ☆24 · Aug 15, 2023 · Updated 2 years ago
- Open Machine Learning Compiler Framework ☆13,142 · Updated this week
- ffcnn is a CNN inference framework written in 600 lines of C. ☆84 · Nov 28, 2025 · Updated 3 months ago
- Linux kernel source tree ☆180 · Nov 15, 2021 · Updated 4 years ago