jegly / OfflineLLM
A privacy-first Android chat app that runs large language models entirely on-device. No internet, no cloud, no tracking. Built with Kotlin, Jetpack Compose, and llama.cpp with optimized ARM NEON/SVE inference.
79 stars · Updated Apr 19, 2026
