sophgo / tpu_compiler
CVITEK AI compiler based on MLIR
☆23Updated 3 years ago
Alternatives and similar repositories for tpu_compiler
Users interested in tpu_compiler are comparing it to the libraries listed below
Sorting:
- Zhouyi model zoo☆104Updated last month
- VeriSilicon Tensor Interface Module☆241Updated last week
- DDK for Rockchip NPU☆68Updated 4 years ago
- Tengine Convert Tool supports converting multiple frameworks' models into the tmfile format suitable for the Tengine-Lite AI framework.☆92Updated 4 years ago
- Pipeline examples based on AX650N/AX8850 demonstrating software development with the Image Processing, NPU, Codec, and Display modules, …☆12Updated 3 months ago
- An optimized neural network operator library for chips based on the Xuantie CPU.☆96Updated last year
- linux bsp app & sample for axpi (ax620a)☆36Updated 2 years ago
- arm-neon☆92Updated last year
- armchina NPU parser☆40Updated last month
- Acuity Model Zoo☆149Updated 2 months ago
- Sample code for world-class Artificial Intelligence SoCs for computer vision applications.☆275Updated last week
- Based on the paper "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference"☆65Updated 4 years ago
- Tencent NCNN with added CUDA support☆71Updated 4 years ago
- Tengine gemm tutorial, step by step☆13Updated 4 years ago
- ☆33Updated 2 years ago
- Sophgo AI chips driver and runtime library.☆24Updated 2 weeks ago
- arm neon related documentation and explanations of instruction semantics☆246Updated 6 years ago
- A Keras HDF5 to ncnn model converter☆88Updated 3 years ago
- ☆19Updated 3 months ago
- ☆36Updated last year
- ☆17Updated 5 years ago
- benchmark models for TNN, ncnn, MNN☆20Updated 5 years ago
- A Winograd based kernel for convolutions in deep learning framework☆15Updated 8 years ago
- symmetric int8 gemm☆67Updated 5 years ago
- A simple forward-inference framework extracted from MNN (for study!)☆23Updated 4 years ago
- ONNX converter and optimizer scripts for Kneron hardware.☆40Updated 2 years ago
- armchina NPU Integration☆24Updated last month
- TVM tutorial☆66Updated 6 years ago
- ☆46Updated last year
- ncnn is a high-performance neural network inference framework optimized for the mobile platform☆123Updated last week
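Several entries above (the symmetric int8 GEMM and the implementation of "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference") revolve around the same idea: quantize float matrices to int8, multiply with int32 accumulation, then rescale. A minimal NumPy sketch of symmetric per-tensor quantization, not taken from any of the listed repositories:

```python
import numpy as np

def quantize_symmetric_int8(x):
    """Symmetric per-tensor quantization: max |x| maps to 127, zero-point is 0."""
    scale = max(np.max(np.abs(x)) / 127.0, 1e-12)  # guard against all-zero input
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def int8_gemm(a_q, b_q, a_scale, b_scale):
    """int8 x int8 matmul accumulated in int32, then dequantized to float32."""
    acc = a_q.astype(np.int32) @ b_q.astype(np.int32)
    return acc.astype(np.float32) * (a_scale * b_scale)

np.random.seed(0)
a = np.random.randn(4, 8).astype(np.float32)
b = np.random.randn(8, 4).astype(np.float32)
a_q, sa = quantize_symmetric_int8(a)
b_q, sb = quantize_symmetric_int8(b)
approx = int8_gemm(a_q, b_q, sa, sb)
# approx agrees with a @ b up to quantization error
```

Because both zero-points are zero, the dequantization collapses to a single multiplier, which is why symmetric schemes are popular for NPU kernels.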