tensorflow / tflite-support
TFLite Support is a toolkit that helps users develop ML applications and deploy TFLite models onto mobile / IoT devices.
☆406 · Updated last month
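For a sense of what the toolkit provides, here is a minimal, hedged sketch of image classification with the tflite-support Task Library for Python. The model file, image file, and option values are placeholders, and the snippet assumes the `tflite_support.task` package layout published on PyPI.

```python
# Minimal sketch of inference with the tflite-support Task Library (Python).
# Assumes the `tflite_support.task` package layout from PyPI;
# "classifier.tflite" and "cat.jpg" are placeholder paths.
from tflite_support.task import core, processor, vision

base_options = core.BaseOptions(file_name="classifier.tflite")
classification_options = processor.ClassificationOptions(max_results=3)
options = vision.ImageClassifierOptions(
    base_options=base_options,
    classification_options=classification_options,
)
classifier = vision.ImageClassifier.create_from_options(options)

# Load an image and print the top-3 predicted categories.
image = vision.TensorImage.create_from_file("cat.jpg")
result = classifier.classify(image)
for category in result.classifications[0].categories:
    print(category.category_name, category.score)
```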
Alternatives and similar repositories for tflite-support
Users interested in tflite-support are comparing it to the libraries listed below.
- LiteRT is the new name for TensorFlow Lite (TFLite). While the name is new, it's still the same trusted, high-performance runtime for on-… ☆469 · Updated this week
- Supporting PyTorch models with the Google AI Edge TFLite runtime. ☆620 · Updated this week
- A performant and modular runtime for TensorFlow ☆761 · Updated last month
- A profiling and performance analysis tool for machine learning ☆387 · Updated this week
- High-efficiency floating-point neural network inference operators for mobile, server, and Web ☆2,039 · Updated this week
- Generate saved_model, tfjs, tf-trt, EdgeTPU, CoreML, quantized tflite, ONNX, OpenVINO, Myriad Inference Engine blob and .pb from .tflite.… ☆269 · Updated 2 years ago
- Sample projects for TensorFlow Lite in C++ with delegates such as GPU, EdgeTPU, XNNPACK, NNAPI ☆372 · Updated 2 years ago
- Convert TensorFlow Lite models (*.tflite) to ONNX (a conversion sketch follows this list). ☆158 · Updated last year
- Common utilities for ONNX converters ☆270 · Updated 6 months ago
- onnxruntime-extensions: A specialized pre- and post-processing library for ONNX Runtime (a registration sketch follows this list). ☆390 · Updated this week
- An awesome list of TensorFlow Lite models, samples, tutorials, tools and learning resources. ☆1,303 · Updated 3 years ago
- Convert tf.keras/Keras models to ONNX ☆379 · Updated 3 years ago
- ☆229 · Updated 2 years ago
- ONNX Optimizer ☆717 · Updated last week
- MLPerf™ Tiny is an ML benchmark suite for extremely low-power systems such as microcontrollers ☆410 · Updated this week
- DistilBERT / GPT-2 for on-device inference thanks to TensorFlow Lite, with Android demo apps ☆409 · Updated last year
- Arm Machine Learning tutorials and examples ☆461 · Updated 2 weeks ago
- ☆324 · Updated last year
- C++ API for ML inferencing and transfer-learning on Coral devices ☆90 · Updated 9 months ago
- Examples for using ONNX Runtime for model training. ☆338 · Updated 7 months ago
- Source code for the userspace-level runtime driver for Coral.ai devices. ☆198 · Updated 9 months ago
- ONNXMLTools enables conversion of models to ONNX ☆1,088 · Updated this week
- Examples using the TensorFlow Lite API to run inference on Coral devices ☆186 · Updated 10 months ago
- Daquexian's NNAPI Library. ONNX + Android NNAPI ☆350 · Updated 5 years ago
- This script converts the ONNX/OpenVINO IR model to TensorFlow's saved_model, tflite, h5, tfjs, tftrt (TensorRT), CoreML, EdgeTPU, ONNX and… ☆341 · Updated 2 years ago
- Parse TFLite models (*.tflite) EASILY with Python; a parsing sketch follows this list. Check the API at https://zhenhuaw.me/tflite/docs/ ☆100 · Updated 4 months ago
- Arm NN ML Software. The code here is a read-only mirror of https://review.mlplatform.org/admin/repos/ml/armnn ☆1,263 · Updated this week
- C++ Helper Class for Deep Learning Inference Frameworks: TensorFlow Lite, TensorRT, OpenCV, OpenVINO, ncnn, MNN, SNPE, Arm NN, NNabla, ON… ☆287 · Updated 3 years ago
- TFLite model analyzer & memory optimizer ☆127 · Updated last year
- Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. Th… ☆398 · Updated last week
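The TFLite-to-ONNX converter mentioned above exposes a one-call Python API in some distributions. The sketch below assumes the `tflite2onnx` package and its `convert(tflite_path, onnx_path)` entry point; the package name and file paths are assumptions for illustration, not details taken from this listing.

```python
# Hedged sketch: convert a .tflite model to ONNX, assuming the `tflite2onnx`
# package with a convert(tflite_path, onnx_path) entry point.
import tflite2onnx

tflite_path = "model.tflite"  # placeholder input
onnx_path = "model.onnx"      # placeholder output

tflite2onnx.convert(tflite_path, onnx_path)

# Optional sanity check of the exported graph with the onnx package.
import onnx
onnx.checker.check_model(onnx.load(onnx_path))
```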
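onnxruntime-extensions supplies custom pre- and post-processing operators (for example tokenizers and image decoders) that ONNX Runtime loads as a shared library. A minimal sketch, assuming the `get_library_path()` helper exported by the package; the model path is a placeholder.

```python
# Hedged sketch: register the onnxruntime-extensions custom-op library with
# an ONNX Runtime session, assuming the package's get_library_path() helper.
import onnxruntime as ort
from onnxruntime_extensions import get_library_path

session_options = ort.SessionOptions()
session_options.register_custom_ops_library(get_library_path())

# "model_with_custom_ops.onnx" is a placeholder for a graph that uses
# pre/post-processing ops provided by the extensions library.
session = ort.InferenceSession(
    "model_with_custom_ops.onnx", sess_options=session_options
)
print([inp.name for inp in session.get_inputs()])
```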
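The TFLite parsing package linked above (documented at https://zhenhuaw.me/tflite/docs/) wraps the FlatBuffers-generated schema classes. A minimal sketch, assuming the `tflite` PyPI package, its standard FlatBuffers accessor names, and the `opcode2name()` helper from its docs; the model path is a placeholder.

```python
# Hedged sketch: walk the operators of a .tflite file with the `tflite`
# parsing package, assuming its FlatBuffers-generated accessors.
import tflite

with open("model.tflite", "rb") as f:  # placeholder model path
    buf = f.read()

model = tflite.Model.GetRootAsModel(buf, 0)
subgraph = model.Subgraphs(0)

# Print the builtin operator name for every op in the first subgraph.
for i in range(subgraph.OperatorsLength()):
    op = subgraph.Operators(i)
    opcode = model.OperatorCodes(op.OpcodeIndex())
    print(i, tflite.opcode2name(opcode.BuiltinCode()))
```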