OAID / Tengine
Tengine is a lightweight, high-performance, modular inference engine for embedded devices.
☆4,449 · Updated last week
Alternatives and similar repositories for Tengine:
Users interested in Tengine often compare it to the libraries listed below.
- AutoKernel is a simple, easy-to-use, low-barrier automatic operator optimization tool that improves the deployment efficiency of deep learning algorithms. ☆736 · Updated 2 years ago
- TengineKit - Free, Fast, Easy, Real-Time Face Detection & Face Landmarks & Face Attributes & Hand Detection & Hand Landmarks & Body Detection… ☆2,291 · Updated 3 years ago
- TNN: developed by Tencent Youtu Lab and Guangying Lab, a uniform deep learning inference framework for mobile, desktop and server. TNN is … ☆4,472 · Updated 2 months ago
- Bolt is a deep learning library with high performance and heterogeneous flexibility. ☆940 · Updated 7 months ago
- Arm NN ML Software. The code here is a read-only mirror of https://review.mlplatform.org/admin/repos/ml/armnn ☆1,238 · Updated this week
- Quantized Neural Network PACKage - mobile-optimized implementation of quantized neural network operators ☆1,536 · Updated 5 years ago
- 😎 A Collection of Awesome NCNN-based Projects ☆731 · Updated 2 years ago
- The Compute Library is a set of computer vision and machine learning functions optimised for both Arm CPUs and GPUs using SIMD technologies. ☆2,934 · Updated last week
- A primitive library for neural networks ☆1,321 · Updated 3 months ago
- 🍅🍅🍅 YOLOv5-Lite: evolved from YOLOv5; the model size is only 900+ KB (int8) and 1.7 MB (fp16), and it reaches 15 FPS on the Raspberry Pi 4B ☆2,334 · Updated 8 months ago
- dabnn is an accelerated binary neural network inference framework for mobile platforms ☆775 · Updated 5 years ago
- 🔥 A mini PyTorch inference framework inspired by darknet (yolov3, yolov4, yolov5, unet, ...) ☆744 · Updated last year
- MobileNetV2-YoloV3-Nano: 0.5 BFLOPs, 3 MB; HUAWEI P40: 6 ms/img. YoloFace-500k: 0.1 BFLOPs, 420 KB ☆1,724 · Updated 4 years ago
- An ultra-lightweight, general-purpose object detection algorithm based on YOLO; the computation is only 250 MFLOPs, and the ncnn model size is… ☆2,037 · Updated 3 years ago
- OneFlow is a deep learning framework designed to be user-friendly, scalable and efficient. ☆7,375 · Updated this week
- ncnn is a high-performance neural network inference framework optimized for the mobile platform ☆21,097 · Updated this week
- ☆864 · Updated last year
- MNN applications by MNN, JNI exec, RK3399. Support tflite/tensorflow/caffe/onnx models. ☆504 · Updated 5 years ago
- ONNX-TensorRT: TensorRT backend for ONNX ☆3,034 · Updated last week
- micronet, a model compression and deployment lib. Compression: 1. quantization: quantization-aware training (QAT), High-Bit (>2b) (DoReFa/Quantiz… ☆2,228 · Updated 3 years ago
- ppl.cv is a high-performance image processing library of openPPL supporting various platforms. ☆500 · Updated 4 months ago
- FeatherCNN is a high performance inference engine for convolutional neural networks. ☆1,216 · Updated 5 years ago
- The minimal opencv for Android, iOS, ARM Linux, Windows, Linux, macOS, HarmonyOS, WebAssembly, watchOS, tvOS, visionOS ☆2,752 · Updated last month
- PaddleSlim is an open-source library for deep model compression and architecture search. ☆1,582 · Updated 3 months ago
- MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba. Full multimodal LLM … ☆9,989 · Updated this week
- Simplify your onnx model ☆3,997 · Updated 6 months ago
- MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms. ☆5,000 · Updated 8 months ago
- PaddlePaddle High Performance Deep Learning Inference Engine for Mobile and Edge ☆7,047 · Updated last month
- ☆241 · Updated 2 months ago
- A flexible, high-performance serving framework for machine learning models (PaddlePaddle's service-oriented deployment framework) ☆903 · Updated 10 months ago