onnxruntime-extensions: A specialized pre- and post-processing library for ONNX Runtime
☆453 · Apr 3, 2026 · Updated last week
Alternatives and similar repositories for onnxruntime-extensions
Users interested in onnxruntime-extensions are comparing it to the libraries listed below.
- ONNX Script enables developers to naturally author ONNX functions and models using a subset of Python. ☆433 · Apr 3, 2026 · Updated last week
- Common utilities for ONNX converters ☆297 · Dec 16, 2025 · Updated 3 months ago
- Examples for using ONNX Runtime for machine learning inferencing. ☆1,635 · Feb 24, 2026 · Updated last month
- ONNX Optimizer ☆803 · Apr 2, 2026 · Updated last week
- Generative AI extensions for onnxruntime ☆998 · Updated this week
- ONNXMLTools enables conversion of models to ONNX ☆1,144 · Apr 1, 2026 · Updated last week
- Olive: Simplify ML Model Finetuning, Conversion, Quantization, and Optimization for CPUs, GPUs and NPUs. ☆2,289 · Updated this week
- Simplify your onnx model ☆4,315 · Updated this week
- ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator ☆19,779 · Updated this week
- The Triton backend for the ONNX Runtime. ☆173 · Mar 18, 2026 · Updated 3 weeks ago
- Examples for using ONNX Runtime for model training. ☆365 · Oct 23, 2024 · Updated last year
- Tutorial on how to convert machine learned models into ONNX ☆14 · Mar 11, 2023 · Updated 3 years ago
- ☆19 · Mar 15, 2024 · Updated 2 years ago
- ☆10 · Jul 18, 2024 · Updated last year
- Representation and Reference Lowering of ONNX Models in MLIR Compiler Infrastructure ☆996 · Apr 3, 2026 · Updated last week
- A tool for modifying ONNX models visually, based on Netron and Flask. ☆1,621 · Nov 19, 2025 · Updated 4 months ago
- ONNX Serving is a project written in C++ to serve onnx-mlir compiled models over gRPC and other protocols. Benefiting from C++ implement… ☆26 · Sep 17, 2025 · Updated 6 months ago
- Transformer tokenizers (e.g. the BERT tokenizer) in C++ (WIP) ☆18 · Apr 7, 2022 · Updated 4 years ago
- Convert TensorFlow, Keras, TensorFlow.js and TFLite models to ONNX ☆2,526 · Apr 2, 2026 · Updated last week
- A tool for parsing, editing, optimizing, and profiling ONNX models. ☆482 · Mar 11, 2026 · Updated last month
- Scailable ONNX Python tools ☆98 · Oct 25, 2024 · Updated last year
- ONNX-TensorRT: TensorRT backend for ONNX ☆3,201 · Mar 25, 2026 · Updated 2 weeks ago
- Useful TensorRT plugins for PyTorch and mmdetection model conversion. ☆165 · Oct 8, 2024 · Updated last year
- Open standard for machine learning interoperability ☆20,584 · Apr 3, 2026 · Updated last week
- 🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers with easy-to-use hardware optimization… ☆3,348 · Apr 2, 2026 · Updated last week
- PyTorch/TorchScript/FX compiler for NVIDIA GPUs using TensorRT ☆2,962 · Updated this week
- A collection of pre-trained, state-of-the-art models in the ONNX format ☆9,521 · Mar 9, 2026 · Updated last month
- Model compression for ONNX ☆100 · Mar 1, 2026 · Updated last month
- QONNX: Arbitrary-Precision Quantized Neural Networks in ONNX ☆180 · Mar 25, 2026 · Updated 2 weeks ago
- Use safetensors with ONNX 🤗 ☆89 · Mar 31, 2026 · Updated last week
- Sample projects for InferenceHelper, a Helper Class for Deep Learning Inference Frameworks: TensorFlow Lite, TensorRT, OpenCV, ncnn, MNN,… ☆22 · Mar 27, 2022 · Updated 4 years ago
- Productionize machine learning predictions, with ONNX or without ☆66 · Jan 11, 2024 · Updated 2 years ago
- Efficient, scalable and enterprise-grade CPU/GPU inference server for 🤗 Hugging Face transformer models 🚀 ☆1,688 · Oct 23, 2024 · Updated last year
- Python inference samples for YuNet using ONNX and TensorFlow Lite ☆20 · Nov 17, 2021 · Updated 4 years ago
- ☆26 · Dec 1, 2020 · Updated 5 years ago
- SOTA low-bit LLM quantization (INT8/FP8/MXFP8/INT4/MXFP4/NVFP4) & sparsity; leading model compression techniques on PyTorch, TensorFlow, … ☆2,612 · Updated this week
- A tool to convert a TensorRT engine/plan to a fake ONNX model ☆41 · Nov 22, 2022 · Updated 3 years ago
- Convert ONNX models to PyTorch. ☆734 · Oct 14, 2025 · Updated 5 months ago
- NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source compone… ☆12,877 · Mar 25, 2026 · Updated 2 weeks ago