google-ai-edge / ai-edge-torch
Supporting PyTorch models with the Google AI Edge TFLite runtime.
☆569 · Updated this week
Alternatives and similar repositories for ai-edge-torch
Users interested in ai-edge-torch are comparing it to the libraries listed below.
- LiteRT is the new name for TensorFlow Lite (TFLite). While the name is new, it's still the same trusted, high-performance runtime for on-… ☆386 · Updated this week
- Self-Created Tools to convert ONNX files (NCHW) to TensorFlow/TFLite/Keras format (NHWC). The purpose of this tool is to solve the massiv… ☆791 · Updated last month
- Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. Th… ☆394 · Updated last week
- Pytorch to Keras/Tensorflow/TFLite conversion made intuitive ☆308 · Updated 2 months ago
- ONNX Script enables developers to naturally author ONNX functions and models using a subset of Python. ☆349 · Updated this week
- Conversion of PyTorch Models into TFLite ☆375 · Updated 2 years ago
- The Qualcomm® AI Hub Models are a collection of state-of-the-art machine learning models optimized for performance (latency, memory etc.)… ☆687 · Updated this week
- AI Edge Quantizer: flexible post-training quantization for LiteRT models. ☆32 · Updated this week
- onnxruntime-extensions: A specialized pre- and post-processing library for ONNX Runtime ☆385 · Updated this week
- ☆323 · Updated last year
- A set of simple tools for splitting, merging, OP deletion, size compression, rewriting attributes and constants, OP generation, change op… ☆292 · Updated last year
- nvidia-modelopt is a unified library of state-of-the-art model optimization techniques like quantization, pruning, distillation, speculat… ☆909 · Updated last week
- On-device AI across mobile, embedded and edge for PyTorch ☆2,829 · Updated this week
- This script converts the ONNX/OpenVINO IR model to Tensorflow's saved_model, tflite, h5, tfjs, tftrt (TensorRT), CoreML, EdgeTPU, ONNX and… ☆341 · Updated 2 years ago
- A parser, editor and profiler tool for ONNX models. ☆431 · Updated 4 months ago
- 🤗 Optimum Intel: Accelerate inference with Intel optimization tools ☆464 · Updated this week
- ☆136 · Updated 2 months ago
- Examples for using ONNX Runtime for machine learning inferencing. ☆1,375 · Updated this week
- The Qualcomm® AI Hub apps are a collection of state-of-the-art machine learning models optimized for performance (latency, memory etc.) a… ☆189 · Updated this week
- Common utilities for ONNX converters ☆268 · Updated 5 months ago
- Neural Network Compression Framework for enhanced OpenVINO™ inference ☆1,009 · Updated this week
- ONNX Optimizer ☆707 · Updated 2 weeks ago
- TFLite Support is a toolkit that helps users to develop ML and deploy TFLite models onto mobile / IoT devices. ☆405 · Updated 3 weeks ago
- TFLite model analyzer & memory optimizer ☆126 · Updated last year
- Model compression for ONNX ☆92 · Updated 5 months ago
- AIMET is a library that provides advanced quantization and compression techniques for trained neural network models. ☆2,302 · Updated this week
- Generative AI extensions for onnxruntime ☆710 · Updated this week
- A pytorch quantization backend for optimum ☆935 · Updated 3 weeks ago
- Visualize ONNX models with model-explorer ☆33 · Updated 2 months ago
- PyTorch native quantization and sparsity for training and inference ☆2,030 · Updated this week