JDAI-CV / DNNLibrary
Daquexian's NNAPI Library. ONNX + Android NNAPI
☆350 · Updated 5 years ago
Alternatives and similar repositories for DNNLibrary
Users interested in DNNLibrary are comparing it to the libraries listed below.
- Benchmarking Neural Network Inference on Mobile Devices ☆377 · Updated 2 years ago
- Mobile AI Compute Engine Model Zoo ☆376 · Updated 4 years ago
- A quick view of high-performance convolutional neural network (CNN) inference engines on mobile devices ☆150 · Updated 3 years ago
- Benchmark for embedded-AI deep learning inference engines, such as NCNN / TNN / MNN / TensorFlow Lite etc. ☆204 · Updated 4 years ago
- Optimizing Mobile Deep Learning on ARM GPU with TVM ☆181 · Updated 6 years ago
- A very fast neural network computing framework optimized for mobile platforms. QQ group: 676883532 (join verification message: 绝影) ☆268 · Updated 7 years ago
- WeChat: NeuralTalk. Weekly report and awesome list of embedded AI ☆379 · Updated 3 years ago
- Embedded and Mobile Deployment ☆71 · Updated 7 years ago
- Open Source Library for GPU-Accelerated Execution of Trained Deep Convolutional Neural Networks on Android ☆540 · Updated 8 years ago
- Demonstration of using Caffe2 inside an Android application ☆347 · Updated 6 years ago
- Use ncnn in Android and iOS; ncnn is a high-performance neural network inference framework optimized for mobile platforms ☆298 · Updated 11 months ago
- Generate a quantization parameter file for ncnn framework int8 inference ☆518 · Updated 5 years ago
- Heterogeneous Run Time version of Caffe. Adds heterogeneous capabilities to Caffe, using a heterogeneous computing infrastructure frame… ☆269 · Updated 6 years ago
- ☆209 · Updated 7 years ago
- Acuity Model Zoo ☆145 · Updated 2 years ago
- [Deprecated] Convert caffemodel to DNNLibrary's daq format ☆18 · Updated 6 years ago
- Minimal runtime core of Caffe: forward only, GPU support, and memory efficiency ☆373 · Updated 5 years ago
- Optimized (for size and speed) Caffe lib for iOS and Android with out-of-the-box demo app ☆315 · Updated 7 years ago
- Porting Caffe to the Android platform ☆507 · Updated 6 years ago
- ☆46 · Updated 4 years ago
- Simple Android demo: MobileFaceNet with ncnn ☆119 · Updated 7 years ago
- High-performance cross-platform inference engine; Anakin runs on x86 CPU, ARM, NVIDIA GPU, AMD GPU, Bitmain, and Cambricon devices ☆533 · Updated 2 years ago
- MTCNN face detection project for Windows and Android built on the ncnn framework ☆528 · Updated 6 years ago
- ☆28 · Updated 4 years ago
- Benchmark of TVM quantized model on CUDA ☆111 · Updated 5 years ago
- Demonstrate Plugin API for TensorRT 2.1 ☆182 · Updated 7 years ago
- Tengine GEMM tutorial, step by step ☆13 · Updated 4 years ago
- Convert Caffe models to ONNX models ☆176 · Updated 2 years ago
- PyTorch model to Caffe & ncnn ☆393 · Updated 7 years ago
- MTCNN C++ implementation with NVIDIA TensorRT Inference accelerator SDK ☆202 · Updated 5 years ago