quic / ai-hub-apps
The Qualcomm® AI Hub apps are a collection of state-of-the-art machine learning models optimized for performance (latency, memory, etc.) and ready to deploy on Qualcomm® devices.
☆66 · Updated last week
Related projects
Alternatives and complementary repositories for ai-hub-apps
- The Qualcomm® AI Hub Models are a collection of state-of-the-art machine learning models optimized for performance (latency, memory, etc.)… ☆502 · Updated last week
- Demonstration of running a native LLM on an Android device. ☆75 · Updated this week
- LiteRT is the new name for TensorFlow Lite (TFLite). While the name is new, it's still the same trusted, high-performance runtime for on-… ☆155 · Updated this week
- Supporting PyTorch models with the Google AI Edge TFLite runtime. ☆374 · Updated this week
- Run Stable Diffusion inference on an Android phone's CPU ☆145 · Updated 11 months ago
- llama.cpp tutorial on an Android phone ☆77 · Updated 3 months ago
- ☆105 · Updated last month
- On-device Speech Recognition for Android ☆26 · Updated last month
- A text-to-image generation project based on the open-source Stable Diffusion V1.5 model; it produces models that can run on a phone's CPU and NPU, together with a companion model runtime framework. ☆111 · Updated 7 months ago
- ☆22 · Updated 2 months ago
- An Android app running inference on Depth-Anything and Depth-Anything-V2 ☆33 · Updated 4 months ago
- Run rwkv5 or rwkv6 inference with the Qualcomm AI Engine Direct SDK ☆38 · Updated this week
- On-device Stable Diffusion for mobile devices ☆50 · Updated last year
- Run generative AI models with a simple C++/Python API using the OpenVINO Runtime ☆155 · Updated this week
- A mobile implementation of llama.cpp ☆294 · Updated 9 months ago
- Generative AI extensions for onnxruntime ☆520 · Updated this week
- Inference Llama 2 in one file of pure C ☆40 · Updated last year
- Workbench for learning & practising AI tech in real scenarios on Android devices, powered by GGML (Georgi Gerganov Machine Learning) and NCNN (T… ☆127 · Updated 5 months ago
- High-performance, optimized pre-trained template AI application pipelines for systems using Hailo devices ☆95 · Updated last month
- Add MobileSAM support for Inpaint anything using Segment Anything and inpainting models. ☆48 · Updated last year
- Inference Vision Transformer (ViT) in plain C/C++ with ggml ☆234 · Updated 7 months ago
- Fast Multimodal LLM on Mobile Devices ☆537 · Updated this week
- An Android demo of depth_anything_v1 and depth_anything_v2 ☆53 · Updated 5 months ago
- ☆82 · Updated last year
- Optimized local inference for LLMs with HuggingFace-like APIs for quantization, vision/language models, multimodal agents, speech, vector… ☆197 · Updated last month
- Python scripts for the Segment Anything 2 (SAM2) model in ONNX ☆177 · Updated 2 months ago
- Demonstration of MobileSAM in the browser enabled through ONNX runtime web ☆91 · Updated last year
- GPU Accelerated TensorFlow Lite applications on Android NDK. Higher accuracy face detection, Age and gender estimation, Human pose estima… ☆151 · Updated 3 years ago
- TensorRT Model Optimizer is a unified library of state-of-the-art model optimization techniques such as quantization, pruning, distillati… ☆580 · Updated this week