tensorflow / tflite-support
TFLite Support is a toolkit that helps users develop ML and deploy TFLite models onto mobile / IoT devices.
☆433 · Updated 2 weeks ago
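A minimal sketch of what on-device inference with this toolkit can look like, assuming the `tflite_support` pip package and its Task Library Python bindings; "model.tflite" and "image.jpg" are placeholder paths, not files from the repository.

```python
from tflite_support.task import vision

# Load an image classification model that carries TFLite Metadata.
classifier = vision.ImageClassifier.create_from_file("model.tflite")

# Wrap an input image and run inference through the Task Library.
image = vision.TensorImage.create_from_file("image.jpg")
result = classifier.classify(image)

# Print the categories returned for the first classification head.
for category in result.classifications[0].categories:
    print(category.category_name, category.score)
```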
Alternatives and similar repositories for tflite-support
Users interested in tflite-support are comparing it to the libraries listed below.
- OpenVINO™ integration with TensorFlow ☆180 · Updated last year
- High-efficiency floating-point neural network inference operators for mobile, server, and Web ☆2,245 · Updated this week
- DistilBERT / GPT-2 for on-device inference thanks to TensorFlow Lite with Android demo apps ☆418 · Updated 2 years ago
- onnxruntime-extensions: A specialized pre- and post-processing library for ONNX Runtime ☆441 · Updated this week
- Support PyTorch model conversion with LiteRT. ☆930 · Updated this week
- Examples for using ONNX Runtime for model training. ☆361 · Updated last year
- LiteRT, successor to TensorFlow Lite, is Google's on-device framework for high-performance ML & GenAI deployment on edge platforms, via e… ☆1,399 · Updated last week
- A profiling and performance analysis tool for machine learning ☆474 · Updated this week
- TensorFlow Backend for ONNX ☆1,325 · Updated last year
- ONNX Optimizer ☆795 · Updated this week
- An awesome list of TensorFlow Lite models, samples, tutorials, tools and learning resources. ☆1,357 · Updated 3 years ago
- A performant and modular runtime for TensorFlow ☆753 · Updated 5 months ago
- Conversion of PyTorch Models into TFLite ☆399 · Updated 2 years ago
- Convert TensorFlow Lite models (*.tflite) to ONNX. ☆171 · Updated 2 years ago
- Sample projects for TensorFlow Lite in C++ with delegates such as GPU, EdgeTPU, XNNPACK, NNAPI ☆381 · Updated 3 years ago
- Build-related tools for TensorFlow ☆300 · Updated 3 weeks ago
- Common utilities for ONNX converters ☆293 · Updated last month
- ☆322 · Updated last month
- Dockerfiles and scripts for ONNX container images ☆138 · Updated 3 years ago
- PyTorch to TensorFlow Lite converter ☆183 · Updated last year
- This script converts the ONNX/OpenVINO IR model to TensorFlow's saved_model, tflite, h5, tfjs, tftrt(TensorRT), CoreML, EdgeTPU, ONNX and… ☆345 · Updated 3 years ago
- Convert tf.keras/Keras models to ONNX ☆382 · Updated 4 years ago
- Generate saved_model, tfjs, tf-trt, EdgeTPU, CoreML, quantized tflite, ONNX, OpenVINO, Myriad Inference Engine blob and .pb from .tflite.… ☆272 · Updated 3 years ago
- ONNXMLTools enables conversion of models to ONNX ☆1,140 · Updated last week
- A scalable inference server for models optimized with OpenVINO™ ☆823 · Updated this week
- TensorFlow wheels (whl) for aarch64 / ARMv8 / ARM64 ☆141 · Updated 2 years ago
- 🤗 Optimum Intel: Accelerate inference with Intel optimization tools ☆532 · Updated this week
- A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning (see the sketch after this list). ☆1,562 · Updated last week
- Parse TFLite models (*.tflite) EASILY with Python. Check the API at https://zhenhuaw.me/tflite/docs/ ☆104 · Updated last year
- Intel® AI Reference Models: contains Intel optimizations for running deep learning workloads on Intel® Xeon® Scalable processors and Inte… ☆728 · Updated this week
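The model-optimization entry above covers the common path from a Keras model to a quantized .tflite file. A minimal sketch, assuming the `tensorflow` and `tensorflow-model-optimization` pip packages; the tiny Sequential model is only a placeholder, and you would fine-tune the quantization-aware model on real data before converting.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Placeholder Keras model; any Sequential/functional model works the same way.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(10),
])

# Wrap the model for quantization-aware training, then fine-tune as usual.
q_aware_model = tfmot.quantization.keras.quantize_model(model)
q_aware_model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)

# Convert the quantization-aware model to a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(q_aware_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model_quant.tflite", "wb") as f:
    f.write(tflite_model)
```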