kraiskil / onnx2c
Open Neural Network Exchange to C compiler.
☆225 · Updated last week
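onnx2c reads an ONNX model and emits a standalone C source file that is compiled into the target firmware alongside the application code. Below is a minimal sketch of how such generated code might be called; the inference function name (`entry`), the tensor shapes, and the file names are illustrative assumptions only, since the real prototype depends on the model and must be copied from the emitted .c file.

```c
/* Minimal sketch of calling code generated by onnx2c.
 * Assumptions (verify against the generated file):
 *   - onnx2c was run on model.onnx and its output saved as model.c
 *   - the emitted inference function is named entry() and takes one
 *     1x4 float input tensor and one 1x3 float output tensor
 */
#include <stdio.h>

/* Copy the real prototype from the generated model.c. */
void entry(const float input[1][4], float output[1][3]);

int main(void)
{
    const float input[1][4] = {{5.1f, 3.5f, 1.4f, 0.2f}}; /* example features */
    float output[1][3] = {{0.0f}};

    entry(input, output); /* run inference; no interpreter involved */

    for (int i = 0; i < 3; ++i)
        printf("output[%d] = %f\n", i, output[0][i]);
    return 0;
}
```

Compiled together with the generated file (e.g. `gcc main.c model.c -o infer`), this runs the model directly as plain C, with no runtime interpreter.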
Related projects
Alternatives and complementary repositories for onnx2c
- Pure C ONNX runtime with zero dependencies for embedded devices ☆193 · Updated last year
- Generates TFLite Micro code that bypasses the interpreter (calls directly into kernels) ☆77 · Updated 2 years ago
- TFLite model analyzer & memory optimizer ☆120 · Updated 9 months ago
- QONNX: Arbitrary-Precision Quantized Neural Networks in ONNX ☆127 · Updated 3 weeks ago
- CMSIS-NN Library ☆208 · Updated last week
- An optimized neural network operator library for chips based on the Xuantie CPU. ☆86 · Updated 4 months ago
- Converting a deep neural network to integer-only inference in native C via uniform quantization and the fixed-point representation. ☆21 · Updated 2 years ago
- Vendor-independent TinyML deep learning library, compiler, and inference framework for microcomputers and micro-controllers ☆563 · Updated 2 years ago
- This project contains a code generator that produces static C NN inference deployment code targeting tiny micro-controllers (TinyML) as r… ☆27 · Updated 3 years ago
- MLPerf™ Tiny is an ML benchmark suite for extremely low-power systems such as microcontrollers ☆358 · Updated 3 weeks ago
- A parser, editor and profiler tool for ONNX models. ☆400 · Updated this week
- Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. Th… ☆328 · Updated this week
- [NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep L… ☆480 · Updated 7 months ago
- A tool to deploy Deep Neural Networks on PULP-based SoCs ☆79 · Updated 8 months ago
- The EEMBC EnergyRunner application framework for the MLPerf Tiny benchmark. ☆16 · Updated last year
- muRISCV-NN is a collection of efficient deep learning kernels for embedded platforms and microcontrollers. ☆64 · Updated this week
- Floating-Point Optimized On-Device Learning Library for the PULP Platform. ☆28 · Updated last week
- Actively maintained ONNX Optimizer ☆647 · Updated 8 months ago
- Supporting PyTorch models with the Google AI Edge TFLite runtime. ☆371 · Updated this week
- Common utilities for ONNX converters ☆251 · Updated 5 months ago
- Tool for the deployment and analysis of TinyML applications on TFLM and MicroTVM backends ☆30 · Updated this week
- LiteRT is the new name for TensorFlow Lite (TFLite). While the name is new, it's still the same trusted, high-performance runtime for on-… ☆149 · Updated last month
- A plug-and-play, lightweight tool for the inference optimization of deep neural networks ☆37 · Updated 3 weeks ago
- Highly optimized inference engine for Binarized Neural Networks ☆243 · Updated 3 weeks ago
- Efficient decision tree ensembles library for IoT edge nodes ☆13 · Updated last month
- Bring your AI to the Edge - starting from building the ML model to the selection of the target platform to the optimization and implement… ☆52 · Updated 2 years ago
- CMix-NN: Mixed Low-Precision CNN Library for Memory-Constrained Edge Devices ☆39 · Updated 4 years ago
- Scailable ONNX python tools ☆96 · Updated 3 weeks ago
- [ICCAD'22 TinyML Contest] Efficient Heart Stroke Detection on Low-cost Microcontrollers ☆15 · Updated last year