kraiskil / onnx2c
Open Neural Network Exchange to C compiler.
☆259 · Updated last month
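onnx2c turns an ONNX graph into standalone C source, so a model can be compiled straight into firmware without a runtime or interpreter. Below is a minimal, hypothetical sketch of how such generated code might be called from an embedded application; the function name `entry()` and the tensor shapes are illustrative assumptions, not onnx2c's documented interface, so check the generated source for the real signature.

```c
/* Hypothetical caller for code emitted by an ONNX-to-C generator such as
 * onnx2c. The entry() name and tensor shapes below are assumptions for
 * illustration only; link this file together with the generated model
 * source, which provides the real definition. */
#include <stdio.h>

/* Assumed signature of the generated inference entry point
 * (e.g. a 28x28 grayscale input classified into 10 classes). */
void entry(float input[1][28][28], float output[1][10]);

int main(void)
{
    static float input[1][28][28];   /* fill with sensor or image data */
    static float output[1][10];

    entry(input, output);            /* one inference pass, no heap use */

    for (int i = 0; i < 10; i++)
        printf("class %d: %f\n", i, (double)output[0][i]);
    return 0;
}
```

In practice this caller would be compiled alongside the generated file, e.g. `cc main.c model.c`, and the input buffer filled from whatever sensor or preprocessing pipeline the target device uses.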
Alternatives and similar repositories for onnx2c:
Users interested in onnx2c are comparing it to the libraries listed below.
- Pure C ONNX runtime with zero dependencies for embedded devices ☆201 · Updated last year
- Generate TFLite Micro code that bypasses the interpreter (directly calls into kernels) ☆79 · Updated 2 years ago
- ☆218 · Updated last year
- TFLite model analyzer & memory optimizer ☆122 · Updated last year
- CMSIS-NN Library ☆245 · Updated last week
- MLPerf™ Tiny is an ML benchmark suite for extremely low-power systems such as microcontrollers ☆381 · Updated 2 weeks ago
- Vendor-independent TinyML deep learning library, compiler and inference framework for microcomputers and microcontrollers ☆581 · Updated 2 years ago
- A parser, editor and profiler tool for ONNX models. ☆418 · Updated last month
- Representation and Reference Lowering of ONNX Models in MLIR Compiler Infrastructure ☆818 · Updated this week
- A lightweight, portable pure C99 ONNX inference engine for embedded devices with hardware acceleration support. ☆602 · Updated 3 months ago
- This project contains a code generator that produces static C NN inference deployment code targeting tiny micro-controllers (TinyML) as r… ☆28 · Updated 3 years ago
- [NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep L… ☆521 · Updated 11 months ago
- The EEMBC EnergyRunner application framework for the MLPerf Tiny benchmark. ☆16 · Updated last year
- Supporting PyTorch models with the Google AI Edge TFLite runtime. ☆462 · Updated this week
- QONNX: Arbitrary-Precision Quantized Neural Networks in ONNX ☆137 · Updated this week
- Actively maintained ONNX Optimizer ☆673 · Updated last month
- VeriSilicon Tensor Interface Module ☆229 · Updated last month
- An optimized neural network operator library for chips based on Xuantie CPUs ☆87 · Updated 8 months ago
- Inference Vision Transformer (ViT) in plain C/C++ with ggml ☆257 · Updated 10 months ago
- MLPerf™ Tiny Deep Learning Benchmarks for STM32 devices ☆13 · Updated 10 months ago
- Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. Th… ☆366 · Updated this week
- ☆308 · Updated 2 months ago
- LiteRT is the new name for TensorFlow Lite (TFLite). While the name is new, it's still the same trusted, high-performance runtime for on-… ☆292 · Updated this week
- Highly optimized inference engine for Binarized Neural Networks ☆248 · Updated last month
- A simple library to deploy Keras neural networks in pure C for realtime applications ☆87 · Updated 2 months ago
- Common utilities for ONNX converters ☆259 · Updated 3 months ago
- TensorFlow Lite C/C++ library for microcontrollers. ☆30 · Updated 4 years ago
- An MLIR-based toolchain for AMD AI Engine-enabled devices. ☆341 · Updated this week
- Lightweight C implementation of CNNs for Embedded Systems ☆60 · Updated 2 years ago
- DLPrimitives/OpenCL out-of-tree backend for PyTorch ☆323 · Updated 5 months ago