ARM-software / ComputeLibrary
The Compute Library is a set of computer vision and machine learning functions optimised for both Arm CPUs and GPUs using SIMD technologies.
☆3,110 · Updated last week
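A minimal sketch of how the library is typically driven, assuming the arm_compute runtime API (Tensor, TensorInfo, NEGEMM) on the CPU backend; the shapes and scalars here are illustrative only, and the headers and class names should be checked against the repository's own examples.

```cpp
// Sketch (not taken from the repository): configure and run a single-precision
// GEMM with the Compute Library's Neon CPU backend, assuming the arm_compute
// runtime API. Shapes and alpha/beta values are arbitrary illustration values.
#include "arm_compute/runtime/NEON/NEFunctions.h"
#include "arm_compute/runtime/Tensor.h"

int main() {
    using namespace arm_compute;

    Tensor a, b, c;
    // Describe three 64x64 FP32 matrices (c = alpha * a * b + beta * c).
    a.allocator()->init(TensorInfo(TensorShape(64U, 64U), 1, DataType::F32));
    b.allocator()->init(TensorInfo(TensorShape(64U, 64U), 1, DataType::F32));
    c.allocator()->init(TensorInfo(TensorShape(64U, 64U), 1, DataType::F32));

    // Configure the operator before allocating the backing memory.
    NEGEMM gemm;
    gemm.configure(&a, &b, nullptr, &c, 1.0f /*alpha*/, 0.0f /*beta*/);

    a.allocator()->allocate();
    b.allocator()->allocate();
    c.allocator()->allocate();

    // Fill a and b here, then execute the operator.
    gemm.run();
    return 0;
}
```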
Alternatives and similar repositories for ComputeLibrary
Users interested in ComputeLibrary are comparing it to the libraries listed below.
- Arm NN ML Software. ☆1,296 · Updated 2 weeks ago
- Low-precision matrix multiplication ☆1,832 · Updated 2 years ago
- An open optimized software library project for the ARM® Architecture ☆1,529 · Updated 3 years ago
- oneAPI Deep Neural Network Library (oneDNN) ☆3,960 · Updated this week
- High-efficiency floating-point neural network inference operators for mobile, server, and Web ☆2,245 · Updated this week
- C++ image processing and machine learning library using SIMD: SSE, AVX, AVX-512, AMX for x86/x64, NEON for ARM. ☆2,231 · Updated this week
- Quantized Neural Network PACKage - mobile-optimized implementation of quantized neural network operators ☆1,550 · Updated 6 years ago
- Makes ARM NEON documentation accessible (with examples; see the intrinsics sketch after this list) ☆406 · Updated last year
- Acceleration package for neural networks on multi-core CPUs ☆1,703 · Updated last year
- Arm Machine Learning tutorials and examples ☆481 · Updated this week
- FeatherCNN is a high-performance inference engine for convolutional neural networks. ☆1,229 · Updated 6 years ago
- The platform-independent header that allows compiling any C/C++ code containing ARM NEON intrinsic functions for x86 target systems using S… ☆485 · Updated 3 months ago
- A list of ICs and IPs for AI, Machine Learning and Deep Learning. ☆1,700 · Updated last year
- Compiler for Neural Network hardware accelerators ☆3,326 · Updated last year
- Khronos OpenCL-Headers ☆749 · Updated last week
- Tuned OpenCL BLAS ☆1,164 · Updated this week
- Compute Library for Deep Neural Networks (clDNN) ☆576 · Updated 3 years ago
- TinyML AI inference library ☆1,902 · Updated 8 months ago
- nGraph has moved to OpenVINO ☆1,346 · Updated 5 years ago
- Embedded and mobile deep learning research resources ☆761 · Updated 2 years ago
- MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms. ☆5,035 · Updated last year
- Benchmarking Neural Network Inference on Mobile Devices ☆386 · Updated 2 years ago
- AIMET is a library that provides advanced quantization and compression techniques for trained neural network models. ☆2,552 · Updated this week
- Khronos OpenVX Tutorial Material ☆247 · Updated 4 years ago
- OpenBLAS is an optimized BLAS library based on GotoBLAS2 1.13 BSD version. ☆7,253 · Updated last week
- Tengine is a lightweight, high-performance, modular inference engine for embedded devices ☆4,504 · Updated 11 months ago
- A software library containing BLAS functions written in OpenCL ☆863 · Updated last year
- A C++ GPU Computing Library for OpenCL ☆1,644 · Updated last month
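For the NEON documentation entry flagged above, a minimal sketch of what hand-written ARM NEON intrinsics look like; it assumes only <arm_neon.h> and a compiler targeting AArch32/AArch64 (or, in principle, an x86 build through a NEON-to-SSE translation header such as the one listed above).

```cpp
// Sketch (not taken from any listed repository): a fused multiply-add over
// four float lanes using standard Neon intrinsics from <arm_neon.h>.
#include <arm_neon.h>
#include <cstdio>

int main() {
    float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4] = {0.5f, 0.5f, 0.5f, 0.5f};
    float c[4] = {10.0f, 10.0f, 10.0f, 10.0f};

    float32x4_t va = vld1q_f32(a);   // load four lanes from each array
    float32x4_t vb = vld1q_f32(b);
    float32x4_t vc = vld1q_f32(c);
    vc = vmlaq_f32(vc, va, vb);      // vc[i] += va[i] * vb[i]
    vst1q_f32(c, vc);                // store the accumulated result back

    std::printf("%g %g %g %g\n", c[0], c[1], c[2], c[3]);
    return 0;
}
```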