OAID / Tengine
Tengine is a lightweight, high-performance, modular inference engine for embedded devices
☆4,506 · Updated 10 months ago
Alternatives and similar repositories for Tengine
Users interested in Tengine are comparing it to the libraries listed below.
- TengineKit - Free, Fast, Easy, Real-Time Face Detection & Face Landmarks & Face Attributes & Hand Detection & Hand Landmarks & Body Detec… ☆2,319 · Updated 4 years ago
- AutoKernel is a simple, easy-to-use automatic operator optimization tool with a low barrier to entry, improving the deployment efficiency of deep learning algorithms. ☆743 · Updated 3 years ago
- TNN: developed by Tencent Youtu Lab and Guangying Lab, a uniform deep learning inference framework for mobile, desktop and server. TNN is … ☆4,610 · Updated 8 months ago
- 🔥 (yolov3 yolov4 yolov5 unet ...) A mini PyTorch inference framework inspired by darknet. ☆742 · Updated 2 years ago
- Bolt is a deep learning library with high performance and heterogeneous flexibility. ☆955 · Updated 9 months ago
- OneFlow is a deep learning framework designed to be user-friendly, scalable and efficient. ☆9,386 · Updated last month
- FeatherCNN is a high performance inference engine for convolutional neural networks. ☆1,227 · Updated 6 years ago
- 🍅🍅🍅YOLOv5-Lite: evolved from yolov5; the model size is only 900+ KB (int8) and 1.7 MB (fp16). Reaches 15 FPS on the Raspberry Pi 4B~ ☆2,467 · Updated last year
- Quantized Neural Network PACKage - mobile-optimized implementation of quantized neural network operators ☆1,551 · Updated 6 years ago
- TVM Documentation in Simplified Chinese / TVM 中文文档 ☆3,041 · Updated last month
- Arm NN ML Software. ☆1,299 · Updated this week
- MobileNetV2-YoloV3-Nano: 0.5 BFLOPs, 3 MB; HUAWEI P40: 6 ms/img. YoloFace-500k: 0.1 BFLOPs, 420 KB ☆1,748 · Updated 4 years ago
- An inference framework that also ships many useful demos; please star it if you find it useful. ☆2,217 · Updated last year
- MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms. ☆5,029 · Updated last year
- ncnn is a high-performance neural network inference framework optimized for the mobile platform ☆22,606 · Updated last week
- A primitive library for neural network ☆1,368 · Updated last year
- Adlik: Toolkit for Accelerating Deep Learning Inference ☆810 · Updated 2 years ago
- 🛠A lite C++ AI toolkit: 100+ models with MNN, ORT and TRT, including Det, Seg, Stable-Diffusion, Face-Fusion, etc.🎉 ☆4,347 · Updated last month
- An ultra-lightweight, general-purpose object detection algorithm based on YOLO; the computation cost is only 250 MFLOPs, and the ncnn model size is… ☆2,087 · Updated 4 years ago
- Simplify your onnx model ☆4,272 · Updated last week
- ☆939 · Updated 2 years ago
- A lightweight, portable pure C99 onnx inference engine for embedded devices with hardware acceleration support. ☆645 · Updated 5 months ago
- 😎 A Collection of Awesome NCNN-based Projects ☆750 · Updated 3 years ago
- High-performance cross-platform inference engine; you can run Anakin on x86 CPU, Arm, NVIDIA GPU, AMD GPU, Bitmain and Cambricon devices. ☆535 · Updated 3 years ago
- An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications. ☆2,913 · Updated 2 years ago
- ☆241 · Updated 3 years ago
- MNN applications via JNI exec on RK3399. Supports tflite/tensorflow/caffe/onnx models. ☆510 · Updated 6 years ago
- ONNX-TensorRT: TensorRT backend for ONNX ☆3,180 · Updated 2 months ago
- Source-code analysis of the darknet deep learning framework: detailed Chinese annotations covering framework principles and implementation. ☆1,605 · Updated 7 years ago
- MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba. Full multimodal LLM … ☆13,950 · Updated this week