PINTO0309 / tflite2json2tflite
Convert tflite to JSON and make it editable in the IDE. It also converts the edited JSON back to tflite binary.
☆27 · Updated 2 years ago
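Round-tripping a `.tflite` file through editable JSON is typically done with the FlatBuffers compiler (`flatc`) and the TFLite schema (`schema.fbs` from the TensorFlow repository). The sketch below is a minimal, illustrative wrapper rather than this repository's actual script: it assumes `flatc` is on your PATH and that `schema.fbs` and `model.tflite` exist locally; the file names and helper functions are placeholders.

```python
import subprocess
from pathlib import Path

# Illustrative paths -- adjust to your environment.
SCHEMA = Path("schema.fbs")    # TFLite FlatBuffers schema (from the TensorFlow repo)
MODEL = Path("model.tflite")   # model to dump and edit
OUT_DIR = Path("output")


def tflite_to_json(model: Path, schema: Path, out_dir: Path) -> Path:
    """Dump a .tflite FlatBuffer to editable JSON with flatc."""
    out_dir.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["flatc", "-t", "--strict-json", "--defaults-json",
         "-o", str(out_dir), str(schema), "--", str(model)],
        check=True,
    )
    return out_dir / f"{model.stem}.json"


def json_to_tflite(model_json: Path, schema: Path, out_dir: Path) -> Path:
    """Rebuild a .tflite binary from the (possibly edited) JSON."""
    out_dir.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["flatc", "-b", "-o", str(out_dir), str(schema), str(model_json)],
        check=True,
    )
    return out_dir / f"{model_json.stem}.tflite"


if __name__ == "__main__":
    json_path = tflite_to_json(MODEL, SCHEMA, OUT_DIR)
    print(f"Edit {json_path} in your IDE, then call json_to_tflite() to rebuild the binary.")
```

After editing the JSON (for example, renaming tensors or adjusting quantization parameters), the second helper rebuilds the binary from the modified file.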
Alternatives and similar repositories for tflite2json2tflite:
Users interested in tflite2json2tflite are comparing it to the libraries listed below.
- A very simple tool for situations where optimization with onnx-simplifier would exceed the Protocol Buffers upper file size limit of 2GB,… ☆16 · Updated 9 months ago
- Exports the ONNX file to a JSON file and JSON dict. ☆32 · Updated 2 years ago
- Roughly calculate FLOPs of a tflite model ☆37 · Updated 3 years ago
- Simple tool for partial optimization of ONNX. Further optimize some models that cannot be optimized with onnx-optimizer and onnxsim by se… ☆19 · Updated 9 months ago
- A set of simple tools for splitting, merging, OP deletion, size compression, rewriting attributes and constants, OP generation, change op… ☆285 · Updated 9 months ago
- AI Edge Quantizer: flexible post training quantization for LiteRT models. ☆23 · Updated this week
- Model compression for ONNX ☆86 · Updated 3 months ago
- Count number of parameters / MACs / FLOPS for ONNX models. ☆90 · Updated 3 months ago
- A very simple tool that compresses the overall size of the ONNX model by aggregating duplicate constant values as much as possible. ☆52 · Updated 2 years ago
- Very simple NCHW and NHWC conversion tool for ONNX. Change to the specified input order for each and every input OP. Also, change the cha… ☆23 · Updated 8 months ago
- Parse TFLite models (*.tflite) EASILY with Python. Check the API at https://zhenhuaw.me/tflite/docs/ ☆98 · Updated 3 weeks ago
- Inference of quantization aware trained networks using TensorRT ☆80 · Updated 2 years ago
- Convert TensorFlow Lite models (*.tflite) to ONNX. ☆152 · Updated last year
- Build TensorFlow Lite runtime with GitHub Actions ☆25 · Updated last year
- A Toolkit to Help Optimize Large Onnx Model ☆153 · Updated 9 months ago
- Scailable ONNX python tools ☆96 · Updated 3 months ago
- ONNX model visualizer ☆85 · Updated last year
- TFLite model analyzer & memory optimizer ☆122 · Updated last year
- This script converts the ONNX/OpenVINO IR model to Tensorflow's saved_model, tflite, h5, tfjs, tftrt(TensorRT), CoreML, EdgeTPU, ONNX and… ☆340 · Updated 2 years ago
- ONNX Command-Line Toolbox ☆35 · Updated 4 months ago
- A code generator from ONNX to PyTorch code ☆135 · Updated 2 years ago
- ☆22 · Updated this week
- ☆136 · Updated last year
- PyTorch Quantization Aware Training Example ☆128 · Updated 9 months ago
- A parser, editor and profiler tool for ONNX models. ☆414 · Updated last month
- Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. Th… ☆361 · Updated this week
- Improving Post Training Neural Quantization: Layer-wise Calibration and Integer Programming ☆96 · Updated 3 years ago
- Generate saved_model, tfjs, tf-trt, EdgeTPU, CoreML, quantized tflite, ONNX, OpenVINO, Myriad Inference Engine blob and .pb from .tflite.… ☆268 · Updated 2 years ago
- PyTorch Static Quantization Example ☆38 · Updated 3 years ago
- Benchmark inference speed of CNNs with various quantization methods in Pytorch+TensorRT with Jetson Nano/Xavier ☆55 · Updated last year