tensorflow / tflite-support
TFLite Support is a toolkit that helps users develop ML applications and deploy TFLite models onto mobile / IoT devices.
☆416 · Updated 2 weeks ago
Alternatives and similar repositories for tflite-support
Users interested in tflite-support are comparing it to the libraries listed below.
- Supporting PyTorch models with the Google AI Edge TFLite runtime. ☆732 · Updated this week
- A profiling and performance analysis tool for machine learning ☆405 · Updated this week
- LiteRT continues the legacy of TensorFlow Lite as the trusted, high-performance runtime for on-device AI. Now with LiteRT Next, we're exp… ☆688 · Updated this week
- onnxruntime-extensions: A specialized pre- and post-processing library for ONNX Runtime ☆404 · Updated last week
- High-efficiency floating-point neural network inference operators for mobile, server, and Web ☆2,079 · Updated this week
- An awesome list of TensorFlow Lite models, samples, tutorials, tools and learning resources. ☆1,323 · Updated 3 years ago
- Conversion of PyTorch models into TFLite ☆389 · Updated 2 years ago
- OpenVINO™ integration with TensorFlow ☆179 · Updated last year
- Examples for using ONNX Runtime for model training. ☆339 · Updated 9 months ago
- Convert TensorFlow Lite models (*.tflite) to ONNX. ☆161 · Updated last year
- Convert tf.keras/Keras models to ONNX ☆380 · Updated 3 years ago
- Sample projects for TensorFlow Lite in C++ with delegates such as GPU, EdgeTPU, XNNPACK, NNAPI ☆373 · Updated 3 years ago
- This script converts the ONNX/OpenVINO IR model to TensorFlow's saved_model, tflite, h5, tfjs, tftrt (TensorRT), CoreML, EdgeTPU, ONNX and… ☆342 · Updated 2 years ago
- Build-related tools for TensorFlow ☆298 · Updated 3 months ago
- ONNXMLTools enables conversion of models to ONNX ☆1,099 · Updated last month
- A performant and modular runtime for TensorFlow ☆758 · Updated this week
- Generate saved_model, tfjs, tf-trt, EdgeTPU, CoreML, quantized tflite, ONNX, OpenVINO, Myriad Inference Engine blob and .pb from .tflite.… ☆269 · Updated 2 years ago
- TensorFlow/TensorRT integration ☆743 · Updated last year
- Common utilities for ONNX converters ☆276 · Updated 2 weeks ago
- Intel® AI Reference Models: contains Intel optimizations for running deep learning workloads on Intel® Xeon® Scalable processors and Inte… ☆717 · Updated last week
- TensorFlow backend for ONNX ☆1,312 · Updated last year
- A toolkit to optimize ML models for deployment with Keras and TensorFlow, including quantization and pruning. ☆1,546 · Updated this week
- ONNX Optimizer ☆737 · Updated this week
- Build shared libraries (`.so`) to use the TF Lite C++ API in Android applications ☆50 · Updated 2 years ago
- ONNX Script enables developers to naturally author ONNX functions and models using a subset of Python. ☆369 · Updated this week
- C++ API for ML inferencing and transfer-learning on Coral devices ☆93 · Updated 11 months ago
- PyTorch to TensorFlow Lite converter ☆185 · Updated last year
- Examples using the TensorFlow Lite API to run inference on Coral devices ☆186 · Updated last year
- Daquexian's NNAPI Library. ONNX + Android NNAPI ☆350 · Updated 5 years ago