ARM-software / armnn
Arm NN ML Software.
☆1,286 · Updated this week
Alternatives and similar repositories for armnn
Users interested in armnn are comparing it to the libraries listed below.
- The Compute Library is a set of computer vision and machine learning functions optimised for both Arm CPUs and GPUs using SIMD technologies… ☆3,072 · Updated this week
- Arm Machine Learning tutorials and examples ☆475 · Updated last week
- ☆156 · Updated 9 months ago
- Quantized Neural Network PACKage - mobile-optimized implementation of quantized neural network operators ☆1,547 · Updated 6 years ago
- VeriSilicon Tensor Interface Module ☆240 · Updated last month
- Low-precision matrix multiplication ☆1,816 · Updated last year
- Embedded and mobile deep learning research resources ☆757 · Updated 2 years ago
- High-efficiency floating-point neural network inference operators for mobile, server, and Web ☆2,170 · Updated last week
- An open optimized software library project for the ARM® Architecture ☆1,512 · Updated 2 years ago
- ☆243 · Updated 2 years ago
- Benchmarking Neural Network Inference on Mobile Devices ☆383 · Updated 2 years ago
- CMSIS-NN Library ☆325 · Updated last month
- A lightweight, portable pure C99 onnx inference engine for embedded devices with hardware acceleration support. ☆638 · Updated 3 months ago
- A list of ICs and IPs for AI, Machine Learning and Deep Learning. ☆1,694 · Updated last year
- Vendor-independent TinyML deep learning library, compiler and inference framework for microcomputers and microcontrollers ☆599 · Updated 4 months ago
- NVDLA SW ☆508 · Updated 4 years ago
- Generate TFLite Micro code that bypasses the interpreter (directly calls into kernels) ☆82 · Updated 3 years ago
- Bolt is a deep learning library with high performance and heterogeneous flexibility. ☆953 · Updated 7 months ago
- OpenVX sample implementation ☆147 · Updated last year
- Tensorflow Backend for ONNX ☆1,325 · Updated last year
- MLPerf® Tiny is an ML benchmark suite for extremely low-power systems such as microcontrollers ☆432 · Updated 3 months ago
- High-performance cross-platform inference engine; you can run Anakin on x86 CPU, Arm, NVIDIA GPU, AMD GPU, Bitmain, and Cambricon devices. ☆534 · Updated 3 years ago
- WeChat: NeuralTalk, a weekly report and awesome list of embedded AI. ☆379 · Updated 3 years ago
- Acuity Model Zoo ☆147 · Updated 2 months ago
- Open deep learning compiler stack for Kendryte AI accelerators ✨ ☆823 · Updated this week
- ONNX Optimizer ☆772 · Updated 3 weeks ago
- ☆1,019 · Updated last year
- Khronos OpenVX Tutorial Material ☆246 · Updated 4 years ago
- Scripts to build a wheel and a Docker image containing a complete ML framework stack, including dependencies, for AArch64 CPUs, as well as… ☆274 · Updated last week
- Reference implementations of MLPerf® inference benchmarks ☆1,484 · Updated last week