NobuoTsukamoto / meta-tensorflow-lite
Yocto layer for TensorFlow Lite interpreter with Python / C++.
☆37 · Updated last month
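As a quick orientation, here is a minimal sketch of running inference on a target image built with this layer. It assumes the image ships the layer's TensorFlow Lite Python bindings (so that `tflite_runtime` is importable on the device); the model path and dummy input are made up for illustration.

```python
# Minimal sketch: run a .tflite model with the TensorFlow Lite Python bindings
# on a Yocto-built target. The model path below is a hypothetical example.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="/usr/share/models/mobilenet_v1.tflite")  # hypothetical path
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed zeros shaped like the model's first input, just to exercise the runtime.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

print(interpreter.get_tensor(output_details[0]["index"]).shape)
```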
Alternatives and similar repositories for meta-tensorflow-lite
Users interested in meta-tensorflow-lite are comparing it to the libraries listed below.
- minimal example with native build instructions ☆24 · Updated last year
- Build TensorFlow Lite runtime with GitHub Actions ☆25 · Updated last month
- Convert tflite to JSON and make it editable in the IDE. It also converts the edited JSON back to tflite binary. ☆27 · Updated 2 years ago
- Provide Docker build sequences of PyTorch for various environments. ☆16 · Updated 4 years ago
- YOLOX-ti-lite models exportable to TFLite ☆21 · Updated 2 years ago
- ☆25 · Updated last week
- ONNX model visualizer ☆88 · Updated 2 years ago
- ☆80 · Updated last month
- edge/mobile transformer based Vision DNN inference benchmark ☆16 · Updated 2 weeks ago
- Source code for the userspace level runtime driver for Coral.ai devices. ☆200 · Updated last year
- ☆13 · Updated last week
- Semi-automated OpenVINO benchmark_app with variable parameters. User can specify multiple options for any parameters in the benchmark_app… ☆10 · Updated 3 years ago
- Extension package of Apache TVM (Machine Learning Compiler) for Renesas DRP-AI accelerators powered by Edgecortix MERA(TM) Based Apache T… ☆54 · Updated 2 months ago
- Simple tool for partial optimization of ONNX. Further optimize some models that cannot be optimized with onnx-optimizer and onnxsim by se… ☆19 · Updated last year
- Model compression for ONNX ☆97 · Updated 9 months ago
- An open source light-weight and high performance inference framework for Hailo devices ☆128 · Updated this week
- ☆238 · Updated 2 years ago
- Parse TFLite models (*.tflite) EASILY with Python. Check the API at https://zhenhuaw.me/tflite/docs/ (a short usage sketch follows this list) ☆101 · Updated 7 months ago
- Very simple NCHW and NHWC conversion tool for ONNX. Change to the specified input order for each and every input OP. Also, change the cha… ☆25 · Updated 5 months ago
- Yocto/OE-core BSP Layer for Coral Dev Board ☆42 · Updated last year
- C++ API for ML inferencing and transfer-learning on Coral devices ☆93 · Updated last year
- A very simple tool that compresses the overall size of the ONNX model by aggregating duplicate constant values as much as possible. ☆52 · Updated 3 years ago
- New operators for the ReferenceEvaluator, new kernels for onnxruntime, CPU, CUDA ☆35 · Updated last week
- TensorFlow, TensorFlow-Lite, PyTorch, Torchvision, TensorRT benchmarks ☆23 · Updated 9 months ago
- OpenCV Sample Projects in Rust ☆12 · Updated 3 years ago
- AI Edge Quantizer: flexible post-training quantization for LiteRT models. ☆64 · Updated this week
- TensorFlow Lite, Coral Edge TPU samples (Python/C++, Raspberry Pi/Windows/Linux). ☆122 · Updated last year
- Template project for performing inference on ONNX models in Python. ☆17 · Updated 3 years ago
- Exports the ONNX file to a JSON file and JSON dict. ☆33 · Updated 2 years ago
- A very simple tool for situations where optimization with onnx-simplifier would exceed the Protocol Buffers upper file size limit of 2GB,… ☆17 · Updated last year
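For the TFLite parser mentioned above, a rough sketch of inspecting a model's structure might look like the following. It assumes the `tflite` package from https://zhenhuaw.me/tflite/docs/ is installed; the accessor names follow the flatbuffers-generated API documented there and should be checked against those docs, and "model.tflite" is a placeholder filename.

```python
# Rough sketch: read a .tflite file with the `tflite` parsing package and
# print its input tensors and operator opcodes. Filename is a placeholder.
import tflite

with open("model.tflite", "rb") as f:
    buf = f.read()

model = tflite.Model.GetRootAsModel(buf, 0)
subgraph = model.Subgraphs(0)  # most models have a single subgraph

# List the subgraph's input tensors (name and shape).
for i in range(subgraph.InputsLength()):
    tensor = subgraph.Tensors(subgraph.Inputs(i))
    print(tensor.Name().decode("utf-8"), tensor.ShapeAsNumpy())

# Walk the operators and print each one's builtin opcode index.
for i in range(subgraph.OperatorsLength()):
    op = subgraph.Operators(i)
    opcode = model.OperatorCodes(op.OpcodeIndex())
    print(i, opcode.BuiltinCode())
```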