alibaba / MNN
MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba. Full multimodal LLM Android app: [MNN-LLM-Android](./apps/Android/MnnLlmChat/README.md)
☆ 10,050 · Updated this week
Alternatives and similar repositories for MNN:
Users interested in MNN are comparing it to the libraries listed below.
- ncnn is a high-performance neural network inference framework optimized for the mobile platform ☆ 21,150 · Updated this week
- TNN: developed by Tencent Youtu Lab and Guangying Lab, a uniform deep learning inference framework for mobile, desktop and server. TNN is … ☆ 4,479 · Updated this week
- MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms. ☆ 4,999 · Updated 9 months ago
- PaddlePaddle high-performance deep learning inference engine for mobile and edge ☆ 7,053 · Updated 2 months ago
- 🛠 A lite C++ toolkit of 100+ awesome AI models, supporting ORT, MNN, NCNN, TNN and TensorRT. 🎉🎉 ☆ 3,986 · Updated 2 weeks ago
- NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source compone… ☆ 11,350 · Updated last week
- Open deep learning compiler stack for CPUs, GPUs and specialized accelerators ☆ 12,134 · Updated this week
- Bolt is a deep learning library with high performance and heterogeneous flexibility. ☆ 940 · Updated 7 months ago
- An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications. ☆ 2,816 · Updated last year
- The minimal OpenCV for Android, iOS, ARM Linux, Windows, Linux, macOS, HarmonyOS, WebAssembly, watchOS, tvOS, visionOS ☆ 2,759 · Updated 2 months ago
- Quantized Neural Network PACKage - mobile-optimized implementation of quantized neural network operators ☆ 1,537 · Updated 5 years ago
- Simplify your ONNX model ☆ 4,004 · Updated 6 months ago
- Open standard for machine learning interoperability ☆ 18,658 · Updated this week
- Tengine is a lightweight, high-performance, modular inference engine for embedded devices ☆ 4,452 · Updated 2 weeks ago
- ONNX-TensorRT: TensorRT backend for ONNX ☆ 3,044 · Updated 2 weeks ago
- FeatherCNN is a high-performance inference engine for convolutional neural networks. ☆ 1,217 · Updated 5 years ago
- ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator ☆ 16,061 · Updated this week
- Convert TensorFlow, Keras, TensorFlow.js and TFLite models to ONNX ☆ 2,386 · Updated last month
- A high-performance and generic framework for distributed DNN training ☆ 3,668 · Updated last year
- MMdnn is a set of tools to help users inter-operate among different deep learning frameworks, e.g. model conversion and visualization. Co… ☆ 5,810 · Updated 9 months ago
- 🍅 Deploy ncnn on mobile phones. Supports Android and iOS. ☆ 1,520 · Updated 2 years ago
- High-efficiency floating-point neural network inference operators for mobile, server, and Web ☆ 1,985 · Updated this week
- TensorFlow backend for ONNX ☆ 1,296 · Updated 11 months ago
- NanoDet-Plus ⚡ Super fast and lightweight anchor-free object detection model. 🔥 Only 980 KB (int8) / 1.8 MB (fp16) and runs at 97 FPS on cellphone… ☆ 5,891 · Updated 7 months ago
- An open source library for face detection in images. The face detection speed can reach 1000 FPS. ☆ 12,438 · Updated 5 months ago
- An easy-to-use PyTorch to TensorRT converter ☆ 4,694 · Updated 7 months ago
- LightSeq: A High Performance Library for Sequence Processing and Generation ☆ 3,260 · Updated last year
- Jittor is a high-performance deep learning framework based on JIT compiling and meta-operators. ☆ 3,129 · Updated this week
- A primitive library for neural networks ☆ 1,324 · Updated 3 months ago
- The Compute Library is a set of computer vision and machine learning functions optimised for both Arm CPUs and GPUs using SIMD technologi… ☆ 2,937 · Updated 2 weeks ago