intel / inference-engine-node
Bringing hardware-accelerated deep learning inference to Node.js and Electron.js apps.
☆32 · Updated 2 years ago
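For context on the project itself: it exposes the OpenVINO™ Inference Engine to JavaScript so models in OpenVINO IR format can be run from Node.js or Electron. The snippet below is only a minimal sketch of what that could look like; the function names (createCore, readNetwork, loadNetwork, createInferRequest, getBlob) are assumptions modeled on the classic Inference Engine API, so treat them as illustrative and refer to the repository's README and samples for the actual bindings.

```js
// Minimal sketch: running an OpenVINO IR model from Node.js.
// NOTE: every API name below (createCore, readNetwork, loadNetwork,
// createInferRequest, getBlob, wmap/rmap) is an assumption modeled on
// the classic OpenVINO Inference Engine API; check the repository's
// README for the real signatures.
const ie = require('inference-engine-node');

async function run() {
  const core = ie.createCore();                                  // assumed factory for the IE core
  const net = await core.readNetwork('model.xml', 'model.bin');  // OpenVINO IR (topology + weights)
  const execNet = await core.loadNetwork(net, 'CPU');            // compile the network for a device
  const request = execNet.createInferRequest();

  const inputName = net.getInputsInfo()[0].name();               // assumed input introspection
  new Float32Array(request.getBlob(inputName).wmap()).fill(0);   // write (dummy) input data

  await request.infer();                                         // run one inference

  const outputName = net.getOutputsInfo()[0].name();
  const scores = new Float32Array(request.getBlob(outputName).rmap());
  console.log('top score:', Math.max(...scores));
}

run().catch(console.error);
```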
Alternatives and similar repositories for inference-engine-node
Users interested in inference-engine-node are comparing it to the libraries listed below.
- ☆57 · Updated 4 years ago
- This repository contains the results and code for the MLPerf™ Inference v0.5 benchmark. ☆55 · Updated 2 months ago
- Optimizing Mobile Deep Learning on ARM GPU with TVM ☆181 · Updated 6 years ago
- A quick view of high-performance convolutional neural network (CNN) inference engines on mobile devices. ☆151 · Updated 3 years ago
- Explore the Capabilities of the TensorRT Platform ☆264 · Updated 4 years ago
- Benchmark of TVM quantized models on CUDA ☆111 · Updated 5 years ago
- ☆42 · Updated 6 years ago
- Heterogeneous Run Time version of TensorFlow. Added heterogeneous capabilities to TensorFlow; uses heterogeneous computing infrastruc… ☆36 · Updated 7 years ago
- To make it easy to benchmark AI accelerators ☆189 · Updated 2 years ago
- Benchmark for embedded-AI deep learning inference engines such as NCNN, TNN, MNN, and TensorFlow Lite. ☆204 · Updated 4 years ago
- TensorFlow and TVM integration ☆37 · Updated 5 years ago
- TopHub AutoTVM log collections ☆69 · Updated 2 years ago
- Parallel CUDA implementation of non-maximum suppression ☆80 · Updated 5 years ago
- Demonstrates the plugin API for TensorRT 2.1 ☆182 · Updated 8 years ago
- Daquexian's NNAPI library: ONNX + Android NNAPI ☆351 · Updated 5 years ago
- Code for testing native float16 matrix multiplication performance on Tesla P100 and V100 GPUs with cublasHgemm ☆34 · Updated 6 years ago
- DeepDetect performance sheet ☆93 · Updated 6 years ago
- WeChat: NeuralTalk. Weekly report and awesome list for embedded AI. ☆379 · Updated 3 years ago
- OpenVINO ARM64 Notes ☆49 · Updated 5 years ago
- Create a concurrent video analysis pipeline featuring multistream face and human pose detection, vehicle attribute detection, and the abi… ☆52 · Updated last year
- Tencent NCNN with added CUDA support ☆69 · Updated 4 years ago
- OpenVINO™ integration with TensorFlow ☆179 · Updated last year
- Deep Learning Inference benchmark. Supports OpenVINO™ toolkit, TensorFlow, TensorFlow Lite, ONNX Runtime, OpenCV DNN, MXNet, PyTorch, Apa… ☆33 · Updated last month
- Simulate quantization and quantization-aware training for MXNet-Gluon models. ☆45 · Updated 5 years ago
- Repository for OpenVINO's extra modules ☆139 · Updated this week
- Simple Training and Deployment of Fast End-to-End Binary Networks ☆158 · Updated 3 years ago
- ☆45 · Updated last year
- Tengine Convert Tool supports converting models from multiple frameworks into the tmfile format used by the Tengine-Lite AI framework. ☆92 · Updated 4 years ago
- Training Toolbox for Caffe ☆48 · Updated last year
- NVIDIA DeepStream SDK ☆28 · Updated 6 years ago