Tencent / TNN
TNN: a uniform deep learning inference framework for mobile, desktop, and server, developed by Tencent Youtu Lab and Guangying Lab. TNN is distinguished by several outstanding features, including cross-platform capability, high performance, model compression, and code pruning. Based on ncnn and Rapidnet, TNN further strengthens the support and … A minimal usage sketch is shown below.
☆4,508 · Updated last week
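For orientation, here is a minimal C++ sketch of the typical TNN inference flow: load the proto/model pair, create an instance for a compute device, feed an input Mat, run Forward, and read the output. The class and method names (TNN_NS::TNN, ModelConfig, NetworkConfig, Instance::SetInputMat/Forward/GetOutputMat) follow the usage pattern in TNN's public demos as recalled here; treat the header path, enum values, the ReadFile helper, and the input dimensions as assumptions to verify against the repository's headers and examples.

```cpp
// Minimal TNN inference sketch (assumed API; check include/tnn/ headers in the repo).
#include <fstream>
#include <memory>
#include <sstream>
#include <string>
#include <vector>

#include "tnn/core/tnn.h"  // assumed public header path

// Hypothetical helper: read a whole file into a string (proto/model buffers).
static std::string ReadFile(const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    std::ostringstream ss;
    ss << in.rdbuf();
    return ss.str();
}

int main() {
    // 1. Model config: TNN models ship as a .tnnproto (structure) + .tnnmodel (weights) pair.
    TNN_NS::ModelConfig model_config;
    model_config.model_type = TNN_NS::MODEL_TYPE_TNN;
    model_config.params = {ReadFile("model.tnnproto"), ReadFile("model.tnnmodel")};

    TNN_NS::TNN net;
    TNN_NS::Status status = net.Init(model_config);
    if (status != TNN_NS::TNN_OK) return -1;

    // 2. Network config: choose a compute device (CPU here; ARM/OpenCL/Metal/CUDA
    //    are the usual alternatives -- enum names assumed).
    TNN_NS::NetworkConfig network_config;
    network_config.device_type = TNN_NS::DEVICE_X86;

    auto instance = net.CreateInst(network_config, status);
    if (!instance || status != TNN_NS::TNN_OK) return -1;

    // 3. Wrap input data in a TNN Mat (NCHW float, 1x3x224x224 assumed for illustration).
    std::vector<int> dims = {1, 3, 224, 224};
    std::vector<float> input_data(1 * 3 * 224 * 224, 0.f);
    auto input_mat = std::make_shared<TNN_NS::Mat>(TNN_NS::DEVICE_NAIVE, TNN_NS::NCHW_FLOAT,
                                                   dims, input_data.data());

    TNN_NS::MatConvertParam convert_param;  // mean/scale normalization; defaults here
    instance->SetInputMat(input_mat, convert_param);

    // 4. Run inference and fetch the output Mat.
    instance->Forward();
    std::shared_ptr<TNN_NS::Mat> output_mat;
    instance->GetOutputMat(output_mat);
    return 0;
}
```

Models in other formats (ONNX, Caffe, TensorFlow) are first converted to the .tnnproto/.tnnmodel pair with TNN's converter tooling before they can be loaded this way.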
Alternatives and similar repositories for TNN
Users interested in TNN are comparing it to the libraries listed below.
- ncnn is a high-performance neural network inference framework optimized for mobile platforms ☆21,446 · Updated this week
- MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba. Full multimodal LLM … ☆10,894 · Updated this week
- Tengine is a lightweight, high-performance, modular inference engine for embedded devices ☆4,465 · Updated 2 months ago
- PaddlePaddle High Performance Deep Learning Inference Engine for Mobile and Edge ☆7,080 · Updated 3 weeks ago
- Bolt is a deep learning library with high performance and heterogeneous flexibility. ☆944 · Updated last month
- MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms. ☆5,010 · Updated 11 months ago
- 🛠 A lite C++ AI toolkit: 100+🎉 models (Stable-Diffusion, FaceFusion, YOLO series, Det, Seg, Matting) with MNN, ORT and TensorRT. ☆4,083 · Updated 2 weeks ago
- PaddleSlim is an open-source library for deep model compression and architecture search. ☆1,592 · Updated 5 months ago
- FeatherCNN is a high-performance inference engine for convolutional neural networks. ☆1,217 · Updated 5 years ago
- Simplify your ONNX model ☆4,074 · Updated 8 months ago
- A primitive library for neural networks ☆1,337 · Updated 5 months ago
- 😎 A Collection of Awesome NCNN-based Projects ☆738 · Updated 2 years ago
- 🍅 Deploy ncnn on mobile phones. Supports Android and iOS. ☆1,537 · Updated 3 years ago
- The minimal opencv for Android, iOS, ARM Linux, Windows, Linux, macOS, HarmonyOS, WebAssembly, watchOS, tvOS, visionOS ☆2,845 · Updated this week
- ONNX-TensorRT: TensorRT backend for ONNX ☆3,066 · Updated last week
- An inference framework with many useful demos; please give it a star if you find it useful ☆2,218 · Updated 8 months ago
- Quantized Neural Network PACKage - mobile-optimized implementation of quantized neural network operators ☆1,541 · Updated 5 years ago
- OpenMMLab Model Deployment Framework ☆2,939 · Updated 7 months ago
- ⚡️An Easy-to-use and Fast Deep Learning Model Deployment Toolkit for ☁️Cloud 📱Mobile and 📹Edge. Including Image, Video, Text and Audio … ☆3,179 · Updated 2 months ago
- PyTorch Neural Network eXchange ☆581 · Updated 2 weeks ago
- 💎1MB lightweight face detection model ☆7,343 · Updated last year
- NanoDet-Plus ⚡ Super fast and lightweight anchor-free object detection model. 🔥 Only 980 KB (int8) / 1.8 MB (fp16) and runs at 97 FPS on cellphone… ☆5,937 · Updated 9 months ago
- An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications. ☆2,881 · Updated 2 years ago
- MegEngine is a fast, scalable, easy-to-use deep learning framework with automatic differentiation. ☆4,791 · Updated 6 months ago
- Deep learning model converter for PaddlePaddle. ☆754 · Updated 2 months ago
- MNN applications by MNN, JNI exec, RK3399. Supports tflite/tensorflow/caffe/onnx models. ☆506 · Updated 5 years ago
- PPL Quantization Tool (PPQ) is a powerful offline neural network quantization tool. ☆1,687 · Updated last year
- Open standard for machine learning interoperability ☆18,949 · Updated this week
- oneAPI Deep Neural Network Library (oneDNN) ☆3,790 · Updated this week
- ONNX Model Exporter for PaddlePaddle ☆806 · Updated this week