Tencent / TNN
TNN: a uniform deep learning inference framework for mobile, desktop, and server, developed by Tencent Youtu Lab and Guangying Lab. TNN is distinguished by several outstanding features, including its cross-platform capability, high performance, model compression, and code pruning. Based on ncnn and Rapidnet, TNN further strengthens the support and …
☆4,538 · Updated last month
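The entry above is only a catalog card, so here is a rough orientation sketch of what single-image inference with TNN's C++ API tends to look like, modeled on the patterns in TNN's published demos. The model file names, input dimensions, conversion defaults, and error handling are placeholder assumptions; check the exact signatures against the headers of the TNN release you build against.

```cpp
// Minimal TNN C++ inference sketch (placeholders and assumptions noted in comments).
#include <cstdint>
#include <fstream>
#include <memory>
#include <sstream>
#include <string>
#include <vector>

#include "tnn/core/tnn.h"       // TNN_NS::TNN, ModelConfig, NetworkConfig
#include "tnn/core/instance.h"  // TNN_NS::Instance
#include "tnn/core/mat.h"       // TNN_NS::Mat

static std::string ReadFile(const std::string& path) {
    std::ifstream file(path, std::ios::binary);
    std::stringstream buffer;
    buffer << file.rdbuf();
    return buffer.str();
}

int main() {
    // Hypothetical model files produced by TNN's converter tools.
    TNN_NS::ModelConfig model_config;
    model_config.model_type = TNN_NS::MODEL_TYPE_TNN;
    model_config.params = {ReadFile("model.tnnproto"), ReadFile("model.tnnmodel")};

    TNN_NS::TNN net;
    if (net.Init(model_config) != TNN_NS::TNN_OK) return -1;

    // Run on the ARM CPU backend; other device types are selected the same way.
    TNN_NS::NetworkConfig network_config;
    network_config.device_type = TNN_NS::DEVICE_ARM;

    TNN_NS::Status status;
    auto instance = net.CreateInst(network_config, status);
    if (!instance || status != TNN_NS::TNN_OK) return -1;

    // Hypothetical 1x3x224x224 8-bit BGR input held in CPU memory.
    std::vector<int> dims = {1, 3, 224, 224};
    std::vector<uint8_t> pixels(3 * 224 * 224, 0);
    auto input = std::make_shared<TNN_NS::Mat>(TNN_NS::DEVICE_NAIVE, TNN_NS::N8UC3,
                                               dims, pixels.data());

    TNN_NS::MatConvertParam convert;  // mean/scale left at defaults here; model-dependent
    instance->SetInputMat(input, convert);
    instance->Forward();

    std::shared_ptr<TNN_NS::Mat> output;
    instance->GetOutputMat(output);   // post-process output->GetData() as needed
    return 0;
}
```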
Alternatives and similar repositories for TNN
Users interested in TNN are comparing it to the libraries listed below.
- ncnn is a high-performance neural network inference framework optimized for the mobile platform (a minimal usage sketch follows at the end of this list) ☆21,746 · Updated this week
- MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba. Full multimodal LLM … ☆12,218 · Updated this week
- Tengine is a lightweight, high-performance, modular inference engine for embedded devices ☆4,474 · Updated 4 months ago
- MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms. ☆5,014 · Updated last year
- The minimal OpenCV for Android, iOS, ARM Linux, Windows, Linux, macOS, HarmonyOS, WebAssembly, watchOS, tvOS, visionOS ☆2,936 · Updated last month
- Bolt is a deep learning library with high performance and heterogeneous flexibility. ☆950 · Updated 2 months ago
- 😎 A Collection of Awesome NCNN-based Projects ☆742 · Updated 2 years ago
- FeatherCNN is a high-performance inference engine for convolutional neural networks. ☆1,220 · Updated 5 years ago
- 🛠 A lite C++ AI toolkit: 100+ models with MNN, ORT and TRT, including Det, Seg, Stable-Diffusion, Face-Fusion, etc. 🎉 ☆4,148 · Updated this week
- Simplify your ONNX model ☆4,110 · Updated 10 months ago
- 🍅 Deploy ncnn on mobile phones. Supports Android and iOS. ☆1,549 · Updated 3 years ago
- Open deep learning compiler stack for CPU, GPU and specialized accelerators ☆12,411 · Updated this week
- An inference framework that also ships many useful demos; please star it if you find it useful ☆2,217 · Updated 10 months ago
- An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications. ☆2,902 · Updated 2 years ago
- A primitive library for neural network ☆1,344 · Updated 7 months ago
- ONNX-TensorRT: TensorRT backend for ONNX ☆3,103 · Updated 3 weeks ago
- PaddlePaddle High Performance Deep Learning Inference Engine for Mobile and Edge ☆7,119 · Updated last month
- Quantized Neural Network PACKage - mobile-optimized implementation of quantized neural network operators ☆1,541 · Updated 5 years ago
- MNN applications with JNI execution on RK3399; supports tflite/tensorflow/caffe/onnx models. ☆507 · Updated 5 years ago
- High-performance Inference and Deployment Toolkit for LLMs and VLMs based on PaddlePaddle ☆3,362 · Updated this week
- SeetaFace 2: an open-source, full-stack face recognition toolkit. ☆2,191 · Updated last year
- PyTorch Neural Network eXchange ☆598 · Updated this week
- CV-CUDA™ is an open-source, GPU-accelerated library for cloud-scale image processing and computer vision. ☆2,529 · Updated last month
- An ultra-lightweight YOLO-based universal object detection algorithm; the computation cost is only 250 MFLOPs, and the ncnn model size is… ☆2,055 · Updated 3 years ago
- An easy to use PyTorch to TensorRT converter ☆4,768 · Updated 10 months ago
- PPL Quantization Tool (PPQ) is a powerful offline neural network quantization tool. ☆1,700 · Updated last year
- micronet, a model compression and deploy lib. compression: 1) quantization: quantization-aware-training (QAT), High-Bit (>2b) (DoReFa/Quantiz… ☆2,253 · Updated 2 months ago
- NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source compone… ☆11,804 · Updated this week
- PaddleSlim is an open-source library for deep model compression and architecture search. ☆1,596 · Updated last week
- lib, demo, model, data ☆744 · Updated last month
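As referenced next to the ncnn entry above, most of these mobile engines follow a similar load-then-extract flow. Below is a minimal ncnn sketch under assumed conditions: the model files, the 640x480 BGR buffer, the normalization constants, and the "data"/"output" blob names are all placeholders that depend on the converted model.

```cpp
// Minimal ncnn C++ inference sketch; file names, blob names, and sizes are placeholders.
#include <algorithm>
#include <cstdio>
#include <vector>

#include "net.h"  // ncnn::Net, ncnn::Mat, ncnn::Extractor

int main() {
    ncnn::Net net;
    // Hypothetical model files exported by ncnn's conversion tools.
    if (net.load_param("model.param") != 0) return -1;
    if (net.load_model("model.bin") != 0) return -1;

    // Hypothetical 640x480 packed BGR image buffer (w * h * 3 bytes).
    const int w = 640, h = 480;
    std::vector<unsigned char> bgr(w * h * 3, 0);

    // Resize and convert to ncnn's planar float Mat in one call (target 224x224 assumed).
    ncnn::Mat in = ncnn::Mat::from_pixels_resize(bgr.data(), ncnn::Mat::PIXEL_BGR,
                                                 w, h, 224, 224);

    // Typical ImageNet-style normalization; the right values depend on the model.
    const float mean_vals[3] = {123.675f, 116.28f, 103.53f};
    const float norm_vals[3] = {1 / 58.395f, 1 / 57.12f, 1 / 57.375f};
    in.substract_mean_normalize(mean_vals, norm_vals);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in);       // "data" is a placeholder input blob name
    ncnn::Mat out;
    ex.extract("output", out);  // "output" is a placeholder output blob name

    // Print the first few output values as a sanity check.
    for (int i = 0; i < std::min(out.w, 5); i++)
        printf("out[%d] = %f\n", i, out[i]);
    return 0;
}
```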