microsoft / onnxscript
ONNX Script enables developers to naturally author ONNX functions and models using a subset of Python.
☆ 360 · Updated this week
Alternatives and similar repositories for onnxscript
Users interested in onnxscript are comparing it to the libraries listed below.
- Common utilities for ONNX converters (☆ 274 · updated last week)
- onnxruntime-extensions: A specialized pre- and post-processing library for ONNX Runtime (☆ 397 · updated this week)
- ONNX Optimizer (☆ 727 · updated this week)
- Examples for using ONNX Runtime for model training (☆ 338 · updated 8 months ago)
- Representation and Reference Lowering of ONNX Models in MLIR Compiler Infrastructure (☆ 881 · updated last week)
- The Triton backend for ONNX Runtime (☆ 153 · updated this week)
- Accelerate PyTorch models with ONNX Runtime (☆ 362 · updated 4 months ago)
- An open-source efficient deep learning framework/compiler, written in Python (☆ 707 · updated 3 weeks ago)
- 🤗 Optimum Intel: Accelerate inference with Intel optimization tools (☆ 475 · updated this week)
- A Python-level JIT compiler designed to make unmodified PyTorch programs faster (☆ 1,053 · updated last year)
- Model compression for ONNX (☆ 96 · updated 7 months ago)
- torch::deploy (multipy for non-torch uses) is a system that lets you get around the GIL problem by running multiple Python interpreters i… (☆ 180 · updated 3 weeks ago)
- Visualize ONNX models with model-explorer (☆ 36 · updated last month)
- The Triton backend for PyTorch TorchScript models (☆ 154 · updated last week)
- Backward-compatible ML compute opset inspired by HLO/MHLO (☆ 497 · updated last week)
- A Fusion Code Generator for NVIDIA GPUs (commonly known as "nvFuser") (☆ 343 · updated this week)
- A code generator from ONNX to PyTorch code (☆ 138 · updated 2 years ago)
- Triton Model Analyzer is a CLI tool to help with better understanding of the compute and memory requirements of the Triton Inference Serv… (☆ 479 · updated last month)
- Triton Model Navigator is an inference toolkit designed for optimizing and deploying deep learning models with a focus on NVIDIA GPUs (☆ 206 · updated 2 months ago)
- A parser, editor, and profiler tool for ONNX models (☆ 442 · updated last month)
- Generative AI extensions for onnxruntime (☆ 749 · updated last week)
- Scailable ONNX Python tools (☆ 96 · updated 8 months ago)
- A unified library of state-of-the-art model optimization techniques like quantization, pruning, distillation, speculative decoding, etc. … (☆ 1,028 · updated last week)
- OpenAI Triton backend for Intel® GPUs (☆ 191 · updated this week)
- PyTorch RFCs (experimental) (☆ 133 · updated last month)
- cudnn_frontend provides a C++ wrapper for the cuDNN backend API and samples on how to use it (☆ 591 · updated this week)
- Inference Vision Transformer (ViT) in plain C/C++ with ggml (☆ 288 · updated last year)
- The Triton backend for TensorRT (☆ 77 · updated this week)
- Model Compression Toolkit (MCT) is an open-source project for neural network model optimization under efficient, constrained hardware. Th… (☆ 404 · updated this week)
- Common source, scripts, and utilities for creating Triton backends (☆ 330 · updated last week)