DNNLibrary: daquexian's NNAPI library (ONNX + Android NNAPI)
☆350 · Feb 20, 2020 · Updated 6 years ago
Alternatives and similar repositories for DNNLibrary
Users interested in DNNLibrary are comparing it to the libraries listed below.
- [Deprecated] Convert caffemodel to DNNLibrary's daq format ☆18 · Sep 1, 2018 · Updated 7 years ago
- An example app of DNNLibrary :) ☆13 · Jul 26, 2019 · Updated 6 years ago
- dabnn is an accelerated binary neural network inference framework for mobile platforms ☆778 · Nov 12, 2019 · Updated 6 years ago
- Generate a quantization parameter file for int8 inference with the ncnn framework ☆518 · Jul 29, 2020 · Updated 5 years ago
- Benchmarks of ncnn, a high-performance neural network inference framework optimized for mobile platforms ☆72 · Mar 8, 2019 · Updated 7 years ago
- FeatherCNN is a high-performance inference engine for convolutional neural networks. ☆1,226 · Sep 24, 2019 · Updated 6 years ago
- High-performance cross-platform inference engine; Anakin runs on x86 CPU, Arm, NVIDIA GPU, AMD GPU, Bitmain, and Cambricon devices. ☆535 · Sep 23, 2022 · Updated 3 years ago
- A bridge for converting models from PyTorch and other AI training frameworks to C++ inference acceleration libraries such as NCNN ☆20 · Mar 24, 2019 · Updated 6 years ago
- Quantized Neural Network PACKage - mobile-optimized implementation of quantized neural network operators ☆1,549 · Aug 28, 2019 · Updated 6 years ago
- Computation using data flow graphs for scalable machine learning ☆25 · Apr 25, 2017 · Updated 8 years ago
- Minimal runtime core of Caffe: forward only, GPU support, and memory efficiency ☆375 · Jul 15, 2020 · Updated 5 years ago
- ncnn is a high-performance neural network inference framework optimized for the mobile platform ☆14 · May 20, 2022 · Updated 3 years ago
- MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms. ☆5,035 · Jun 17, 2024 · Updated last year
- ☆14 · Dec 31, 2018 · Updated 7 years ago
- Heterogeneous Run Time version of Caffe. Adds heterogeneous capabilities to Caffe using a heterogeneous computing infrastructure frame… ☆269 · Oct 16, 2018 · Updated 7 years ago
- PyTorch model to Caffe & ncnn ☆394 · Jun 27, 2018 · Updated 7 years ago
- Arm NN ML Software ☆1,301 · Jan 23, 2026 · Updated 2 months ago
- Documentation and explanations of Arm NEON instructions ☆247 · May 21, 2019 · Updated 6 years ago
- Bolt is a deep learning library with high performance and heterogeneous flexibility. ☆956 · Apr 11, 2025 · Updated 11 months ago
- PCN based on the ncnn framework ☆81 · Dec 21, 2018 · Updated 7 years ago
- A very fast neural network computing framework optimized for mobile platforms. QQ group: 676883532 (verification message: 绝影) ☆268 · Jan 4, 2018 · Updated 8 years ago
- Benchmarking Neural Network Inference on Mobile Devices ☆386 · Apr 10, 2023 · Updated 2 years ago
- Tengine is a lightweight, high-performance, modular inference engine for embedded devices ☆4,511 · Mar 6, 2025 · Updated last year
- ncnn examples — mask detection: anticonv; face detection: retinaface && mtcnn && centerface; tracking: IoU tracking; landmarks: zqcnn; recognition: m… ☆476 · Jun 22, 2022 · Updated 3 years ago
- Cross-compile libmace.a with a makefile and run deep learning models on the GPU of embedded devices ☆96 · Aug 23, 2018 · Updated 7 years ago
- ☆46 · Jul 18, 2024 · Updated last year
- Symmetric int8 GEMM ☆67 · Jun 7, 2020 · Updated 5 years ago
- Export ONNX QDQ models that conform to the AXERA NPU quantization specification; currently only w8a8 is supported ☆11 · Sep 10, 2024 · Updated last year
- ☆46 · Nov 25, 2024 · Updated last year
- Caffe implementation of Google's MobileNets (v1 and v2) ☆1,274 · Jun 8, 2021 · Updated 4 years ago
- A Caffe implementation of MnasNet: Platform-Aware Neural Architecture Search for Mobile ☆52 · Oct 19, 2018 · Updated 7 years ago
- A neural network forward-pass (inference) framework based on OpenGL ES ☆19 · Nov 23, 2018 · Updated 7 years ago
- MNN MTCNN C++ ☆49 · Jul 31, 2019 · Updated 6 years ago
- MNN: A blazing-fast, lightweight inference engine battle-tested by Alibaba, powering high-performance on-device LLMs and Edge AI ☆14,618 · Updated this week
- Low-precision matrix multiplication ☆1,835 · Jan 29, 2024 · Updated 2 years ago
- Simplify your onnx model ☆4,309 · Feb 26, 2026 · Updated 3 weeks ago
- RetinaFace detector with C++ ☆396 · Jun 19, 2019 · Updated 6 years ago
- MMdnn is a set of tools to help users interoperate among different deep learning frameworks, e.g. model conversion and visualization. Co… ☆5,813 · Aug 7, 2025 · Updated 7 months ago
- ☆23 · Dec 8, 2022 · Updated 3 years ago
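Several of the repositories above (the ncnn int8 quantization table generator, QNNPACK, gemmlowp, the symmetric int8 GEMM) revolve around low-precision inference. As a rough illustration of the underlying idea only, and not the API of any library listed here, the following is a minimal symmetric int8 quantization sketch in plain Python; all function names are hypothetical:

```python
# Illustrative sketch of symmetric int8 quantization, the technique behind
# the low-precision inference repos listed above. Not any library's real API.

def quantize_symmetric(values, num_bits=8):
    """Map floats to signed integers using one scale (zero point fixed at 0)."""
    qmax = 2 ** (num_bits - 1) - 1                 # 127 for int8
    max_abs = max(abs(v) for v in values) or 1.0   # avoid divide-by-zero
    scale = max_abs / qmax                         # float value per integer step
    q = [max(-qmax, min(qmax, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from quantized integers."""
    return [qi * scale for qi in q]

def int8_dot(qa, sa, qb, sb):
    """Integer dot product; the float result is recovered by rescaling,
    which is why int8 GEMM kernels accumulate in int32 and rescale once."""
    acc = sum(x * y for x, y in zip(qa, qb))       # int32-style accumulator
    return acc * sa * sb

a = [0.5, -1.25, 2.0]
b = [1.0, 0.75, -0.5]
qa, sa = quantize_symmetric(a)
qb, sb = quantize_symmetric(b)
approx = int8_dot(qa, sa, qb, sb)
exact = sum(x * y for x, y in zip(a, b))
```

Real engines refine this sketch in two main ways: the scale is calibrated from activation statistics rather than a single tensor's max (the ncnn tool above uses KL-divergence calibration for exactly this), and asymmetric schemes add a zero point so ranges need not be centered on zero.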