microsoft / onnxscript
ONNX Script enables developers to naturally author ONNX functions and models using a subset of Python.
☆412 · Updated this week
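Below is a minimal, illustrative sketch of what authoring a function with ONNX Script can look like. It assumes the `script()` decorator, the `opset18` op builder, the `FLOAT` tensor annotation, and the `to_model_proto()` export method exposed by the `onnxscript` package; treat it as a hedged example rather than the project's canonical usage.

```python
# Minimal sketch (assumed onnxscript API): author an ONNX function in Python
# and turn it into an ONNX model that ONNX Runtime could execute.
from onnxscript import FLOAT, script
from onnxscript import opset18 as op


@script()
def scaled_tanh(X: FLOAT["N"]) -> FLOAT["N"]:
    # Each op.* call below is recorded as one node in the generated ONNX graph.
    two = op.Constant(value_float=2.0)
    return op.Tanh(op.Mul(X, two))


# The decorated function can be exported as a complete ModelProto
# (or as a reusable FunctionProto) for downstream tooling.
model_proto = scaled_tanh.to_model_proto()
```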
Alternatives and similar repositories for onnxscript
Users interested in onnxscript are comparing it to the libraries listed below.
- Common utilities for ONNX converters · ☆288 · Updated 3 months ago
- onnxruntime-extensions: A specialized pre- and post-processing library for ONNX Runtime · ☆430 · Updated this week
- ONNX Optimizer (see the sketch after this list) · ☆781 · Updated last month
- The Triton backend for the ONNX Runtime. · ☆168 · Updated last week
- Visualize ONNX models with model-explorer · ☆64 · Updated last month
- Examples for using ONNX Runtime for model training. · ☆358 · Updated last year
- Accelerate PyTorch models with ONNX Runtime · ☆367 · Updated 9 months ago
- Representation and Reference Lowering of ONNX Models in MLIR Compiler Infrastructure · ☆940 · Updated 2 weeks ago
- torch::deploy (multipy for non-torch uses) is a system that lets you get around the GIL problem by running multiple Python interpreters in a single C++ process. · ☆182 · Updated 3 months ago
- An open-source efficient deep learning framework/compiler, written in Python. · ☆737 · Updated 3 months ago
- Common source, scripts and utilities for creating Triton backends. · ☆361 · Updated 3 weeks ago
- Triton Model Analyzer is a CLI tool to help with better understanding of the compute and memory requirements of the Triton Inference Server models. · ☆500 · Updated this week
- 🤗 Optimum Intel: Accelerate inference with Intel optimization tools · ☆515 · Updated this week
- A Python-level JIT compiler designed to make unmodified PyTorch programs faster. · ☆1,068 · Updated last year
- A parser, editor and profiler tool for ONNX models. · ☆469 · Updated last month
- Triton Model Navigator is an inference toolkit designed for optimizing and deploying Deep Learning models with a focus on NVIDIA GPUs. · ☆213 · Updated 7 months ago
- Scailable ONNX Python tools · ☆97 · Updated last year
- A Fusion Code Generator for NVIDIA GPUs (commonly known as "nvFuser") · ☆363 · Updated this week
- Model compression for ONNX · ☆99 · Updated last year
- Vision Transformer (ViT) inference in plain C/C++ with ggml · ☆300 · Updated last year
- Generative AI extensions for onnxruntime · ☆901 · Updated this week
- Universal cross-platform tokenizer bindings to HF and SentencePiece · ☆426 · Updated 4 months ago
- Backward compatible ML compute opset inspired by HLO/MHLO · ☆576 · Updated last week
- A toolkit to help optimize ONNX models · ☆267 · Updated last week
- A code generator from ONNX to PyTorch code · ☆141 · Updated 3 years ago
- Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. · ☆427 · Updated this week
- PyTriton is a Flask/FastAPI-like interface that simplifies Triton's deployment in Python environments. · ☆830 · Updated 3 months ago
- Efficient in-memory representation for ONNX, in Python · ☆34 · Updated last week
- The Triton backend for the PyTorch TorchScript models. · ☆166 · Updated this week
- The Triton backend for TensorRT. · ☆80 · Updated 3 weeks ago
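As referenced at the ONNX Optimizer entry above, here is a minimal, hedged sketch of how that library is typically used. The file names are placeholders and the pass names are assumptions drawn from commonly available onnxoptimizer passes, not from this listing.

```python
# Minimal sketch (assumed onnxoptimizer API): run a few graph-simplification
# passes over a saved ONNX model. File names here are placeholders.
import onnx
import onnxoptimizer

model = onnx.load("model.onnx")
passes = [
    "eliminate_identity",        # drop no-op Identity nodes
    "eliminate_nop_transpose",   # drop transposes that do nothing
    "fuse_bn_into_conv",         # fold BatchNormalization into Conv weights
]
optimized = onnxoptimizer.optimize(model, passes)
onnx.save(optimized, "model.opt.onnx")
```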