mit-han-lab / tinyengine
[NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning; [NeurIPS 2022] MCUNetV3: On-Device Training Under 256KB Memory
☆870 · Updated 6 months ago
Alternatives and similar repositories for tinyengine
Users interested in tinyengine are comparing it to the libraries listed below.
- [NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning… ☆567 · Updated last year
- ☆903 · Updated last year
- On-Device Training Under 256KB Memory [NeurIPS'22] ☆481 · Updated last year
- MLPerf™ Tiny is an ML benchmark suite for extremely low-power systems such as microcontrollers ☆407 · Updated this week
- CMSIS-NN Library ☆275 · Updated 2 weeks ago
- Arm Machine Learning tutorials and examples ☆460 · Updated last week
- ☆228 · Updated 2 years ago
- Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. Th… ☆398 · Updated this week
- Vendor-independent TinyML deep learning library, compiler, and inference framework for microcomputers and micro-controllers ☆589 · Updated 2 years ago
- This is a list of interesting papers and projects about TinyML. ☆869 · Updated 4 months ago
- Arm NN ML Software. The code here is a read-only mirror of https://review.mlplatform.org/admin/repos/ml/armnn ☆1,262 · Updated this week
- Open Neural Network Exchange to C compiler. ☆278 · Updated last month
- Infrastructure to enable deployment of ML models to low-power resource-constrained embedded targets (including microcontrollers and digit… ☆2,278 · Updated this week
- A lightweight, portable pure C99 onnx inference engine for embedded devices with hardware acceleration support. ☆619 · Updated 6 months ago
- Pure C ONNX runtime with zero dependencies for embedded devices ☆206 · Updated last year
- ☆324 · Updated last year
- A curated list of resources for embedded AI ☆425 · Updated last month
- A higher-level Neural Network library for microcontrollers. ☆1,047 · Updated last year
- TFLite model analyzer & memory optimizer ☆127 · Updated last year
- TinyChatEngine: On-Device LLM Inference Library ☆852 · Updated 10 months ago
- AI Model Zoo for STM32 devices ☆420 · Updated last week
- TinyMaix is a tiny inference library for microcontrollers (TinyML). ☆970 · Updated 3 months ago
- TinyML Cookbook, published by Packt ☆275 · Updated last year
- ONNX Optimizer ☆715 · Updated this week
- TinyNeuralNetwork is an efficient and easy-to-use deep learning model compression framework. ☆826 · Updated last week
- μNAS is a neural architecture search (NAS) system that designs small-yet-powerful microcontroller-compatible neural networks. ☆80 · Updated 4 years ago
- An open-source efficient deep learning framework/compiler, written in python. ☆698 · Updated this week
- LiteRT is the new name for TensorFlow Lite (TFLite). While the name is new, it's still the same trusted, high-performance runtime for on-… ☆469 · Updated this week
- High-efficiency floating-point neural network inference operators for mobile, server, and Web ☆2,030 · Updated this week
- A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices. ☆351 · Updated 10 months ago