alrevuelta / cONNXr
Pure C ONNX runtime with zero dependencies for embedded devices
☆199 · Updated last year
Alternatives and similar repositories for cONNXr:
Users interested in cONNXr are comparing it to the libraries listed below.
- Open Neural Network Exchange to C compiler. ☆247 · Updated this week
- Generate TFLite Micro code that bypasses the interpreter (calls directly into kernels). ☆79 · Updated 2 years ago
- CMSIS-NN Library. ☆231 · Updated last month
- Vendor-independent TinyML deep learning library, compiler, and inference framework for microcomputers and microcontrollers. ☆571 · Updated 2 years ago
- ☆213 · Updated last year
- MLPerf™ Tiny is an ML benchmark suite for extremely low-power systems such as microcontrollers. ☆368 · Updated 3 weeks ago
- A lightweight, portable, pure C99 ONNX inference engine for embedded devices with hardware acceleration support. ☆595 · Updated last month
- TFLite model analyzer & memory optimizer. ☆121 · Updated 11 months ago
- Arm Machine Learning tutorials and examples. ☆443 · Updated last month
- MobileNet v1 trained on ImageNet for STM32 using extended CMSIS-NN with INT-Q quantization support. ☆86 · Updated 4 years ago
- The C++ Neural Network and Machine Learning project is intended to provide a C++ template library for neural nets and machine learning algorithms… ☆79 · Updated 2 years ago
- Lightweight C implementation of CNNs for Embedded Systems. ☆58 · Updated last year
- Converting a deep neural network to integer-only inference in native C via uniform quantization and the fixed-point representation (a minimal quantization sketch follows this list). ☆22 · Updated 2 years ago
- Quantization and Synthesis (Device Specific Code Generation) for ADI's MAX78000 and MAX78002 Edge AI Devices. ☆58 · Updated last month
- The EEMBC EnergyRunner application framework for the MLPerf Tiny benchmark. ☆15 · Updated last year
- ☆30 · Updated 3 years ago
- Efficient neural network deployment for uC using PyTorch models. ☆32 · Updated 4 years ago
- A simple library to deploy Keras neural networks in pure C for realtime applications. ☆87 · Updated last month
- Machine Learning inference engine for Microcontrollers and Embedded devices. ☆545 · Updated last week
- Arm NN ML Software. The code here is a read-only mirror of https://review.mlplatform.org/admin/repos/ml/armnn ☆1,215 · Updated last week
- Highly optimized inference engine for Binarized Neural Networks. ☆245 · Updated 2 months ago
- [NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning… ☆501 · Updated 9 months ago
- Parse TFLite models (*.tflite) EASILY with Python. Check the API at https://zhenhuaw.me/tflite/docs/ ☆97 · Updated last year
- Attentional sequence-to-sequence model (based on LSTMs) in TFLite Micro, tested on Arduino Nano 33 BLE. ☆26 · Updated 2 years ago
- QONNX: Arbitrary-Precision Quantized Neural Networks in ONNX. ☆134 · Updated this week
- ☆305 · Updated 3 weeks ago
- Find arena size for TensorFlow Lite models. ☆27 · Updated last year
- ONNX Runtime Inference C++ Example. ☆227 · Updated last year
- Portable C++ library for signal processing and machine learning inferencing. ☆80 · Updated last month
- A lightweight C library for artificial neural networks. ☆690 · Updated 3 years ago
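
One entry above converts a network to integer-only inference via uniform quantization and a fixed-point representation. As a rough, generic illustration of that idea (not code from any of the listed repositories; the range-to-scale mapping and all names here are assumptions for the sketch), the snippet below quantizes floats to int8 with an affine scale/zero-point and maps them back:

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Uniform (affine) quantization: a real value r is approximated as
 * r ~ scale * (q - zero_point), with q stored as an int8. */
typedef struct {
    float   scale;       /* step size between adjacent quantized levels */
    int32_t zero_point;  /* integer code that represents the real value 0.0f */
} qparams_t;

/* Derive scale/zero-point from an observed real range [rmin, rmax]. */
static qparams_t qparams_from_range(float rmin, float rmax)
{
    qparams_t p;
    if (rmin > 0.0f) rmin = 0.0f;  /* range must contain 0 so 0.0f maps exactly */
    if (rmax < 0.0f) rmax = 0.0f;
    p.scale      = (rmax - rmin) / 255.0f;  /* 256 int8 levels: -128..127 */
    p.zero_point = (int32_t)lroundf(-128.0f - rmin / p.scale);
    return p;
}

static int8_t quantize(float r, qparams_t p)
{
    int32_t q = p.zero_point + (int32_t)lroundf(r / p.scale);
    if (q < -128) q = -128;  /* clamp to the int8 range */
    if (q >  127) q =  127;
    return (int8_t)q;
}

static float dequantize(int8_t q, qparams_t p)
{
    return p.scale * (float)((int32_t)q - p.zero_point);
}

int main(void)
{
    qparams_t p = qparams_from_range(-1.0f, 2.0f);
    float  r = 0.7f;
    int8_t q = quantize(r, p);
    printf("r=%.3f  q=%d  r'=%.3f\n", r, q, dequantize(q, p));  /* r' is close to r */
    return 0;
}
```

In a fully integer-only deployment, the float scale is typically replaced by a fixed-point multiplier and shift so that inference never touches floating point, which is presumably where the fixed-point representation in that entry comes in.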