quic / ai-hub-apps
The Qualcomm® AI Hub apps are a collection of state-of-the-art machine learning models optimized for performance (latency, memory, etc.) and ready to deploy on Qualcomm® devices.
☆216 · Updated 2 weeks ago
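The apps in this repository consume models exported through Qualcomm® AI Hub. Purely as a hedged illustration, the sketch below shows how a PyTorch model might be compiled and profiled for a target device with the `qai_hub` Python client; the device name, input shape, and model choice are placeholders, and an AI Hub API token is assumed to be configured already (e.g. via `qai-hub configure`).

```python
# Hedged sketch: compile and profile a PyTorch model with the qai_hub client.
# The device name, input shape, and output filename are illustrative placeholders.
import torch
import torchvision
import qai_hub as hub

model = torchvision.models.mobilenet_v2().eval()     # any traceable torch.nn.Module
example = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example)

device = hub.Device("Samsung Galaxy S24 (Family)")   # placeholder device name

# Compile for the target device; returns a job handle.
compile_job = hub.submit_compile_job(
    model=traced,
    device=device,
    input_specs=dict(image=(1, 3, 224, 224)),
)
target_model = compile_job.get_target_model()        # waits for compilation to finish

# Profile the compiled model on a hosted physical device.
profile_job = hub.submit_profile_job(model=target_model, device=device)

# Download the on-device asset for use in an app.
target_model.download("mobilenet_v2.tflite")
```

The downloaded artifact (a `.tflite` file in this sketch; other formats depend on compile options) is the kind of asset the sample apps load on device.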
Alternatives and similar repositories for ai-hub-apps
Users interested in ai-hub-apps are comparing it to the libraries listed below.
- The Qualcomm® AI Hub Models are a collection of state-of-the-art machine learning models optimized for performance (latency, memory, etc.)… ☆719 · Updated 2 weeks ago
- ☆143 · Updated 3 months ago
- LLM inference in C/C++ ☆42 · Updated last week
- Demonstration of running a native LLM on an Android device. ☆144 · Updated 2 weeks ago
- Supporting PyTorch models with the Google AI Edge TFLite runtime (see the interpreter sketch after this list). ☆678 · Updated this week
- QAI AppBuilder is designed to help developers easily execute models on WoS and Linux platforms. It encapsulates the Qualcomm® AI Runtime… ☆50 · Updated this week
- A toolkit to help optimize ONNX models. ☆159 · Updated this week
- Demonstration of combining YOLO and depth estimation on an Android device. ☆51 · Updated last month
- Run Stable Diffusion inference on an Android phone's CPU ☆152 · Updated last year
- ☆35 · Updated 2 months ago
- llama.cpp tutorial on an Android phone ☆110 · Updated last month
- Inference of RWKV v5, v6, and v7 with the Qualcomm AI Engine Direct SDK ☆72 · Updated last week
- A text-to-image project based on the open-source Stable Diffusion V1.5 model, producing models that can run on a phone's CPU and NPU, together with a companion model-execution framework. ☆202 · Updated last year
- LiteRT continues the legacy of TensorFlow Lite as the trusted, high-performance runtime for on-device AI. Now with LiteRT Next, we're exp… ☆595 · Updated this week
- Reference implementation of a llama.cpp backend for Android phones equipped with Qualcomm's Hexagon NPU; details can be seen at http… ☆23 · Updated this week
- Run Large Language Models on RK3588 with GPU acceleration ☆107 · Updated last year
- Fast Multimodal LLM on Mobile Devices ☆929 · Updated last week
- Run Generative AI models with a simple C++/Python API using the OpenVINO Runtime ☆293 · Updated this week
- Generative AI extensions for onnxruntime ☆740 · Updated this week
- IRIS is an Android app for interfacing with GGUF / llama.cpp models locally. ☆216 · Updated 4 months ago
- Workbench for learning and practicing on-device AI technology in real scenarios with online TV on an Android phone, powered by ggml (llama.cpp… ☆166 · Updated 2 weeks ago
- Port of Facebook's LLaMA model in C/C++ ☆96 · Updated 2 weeks ago
- ☆838 · Updated last month
- On-device Speech Recognition for Android ☆104 · Updated this week
- Deploying a large language model on Android phones with MNN-LLM: Qwen1.5-0.5B-Chat ☆79 · Updated last year
- Let's use the Qualcomm NPU on Android ☆10 · Updated 4 months ago
- A simple tutorial for SNPE. ☆173 · Updated 2 years ago
- Awesome Mobile LLMs ☆204 · Updated 3 weeks ago
- [EMNLP Findings 2024] MobileQuant: Mobile-friendly Quantization for On-device Language Models ☆63 · Updated 9 months ago
- Low-bit LLM inference on CPU/NPU with lookup table ☆811 · Updated 3 weeks ago
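Several of the entries above (the AI Edge Torch and LiteRT runtimes in particular) ultimately hand a `.tflite` flatbuffer to an interpreter. As a hedged sketch only, assuming the `ai-edge-litert` package is available (with TensorFlow's bundled interpreter as a fallback), a minimal smoke test might look like the following; the model path and dummy input are placeholders.

```python
# Hedged sketch: load a .tflite model and run one dummy inference.
# Package names below are assumptions -- adjust to whichever interpreter you have installed.
import numpy as np

try:
    from ai_edge_litert.interpreter import Interpreter   # LiteRT pip package (assumed)
except ImportError:
    import tensorflow as tf
    Interpreter = tf.lite.Interpreter                     # fallback: TensorFlow's interpreter

interpreter = Interpreter(model_path="mobilenet_v2.tflite")  # placeholder path
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed a zero tensor matching the declared input shape and dtype, then run once.
dummy = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], dummy)
interpreter.invoke()

print("output shape:", interpreter.get_tensor(out["index"]).shape)
```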