Shishir435 / ollama-client
Ollama Client – Chat with Local LLMs Inside Your Browser

A lightweight, privacy-first Chrome extension for chatting with local LLMs via Ollama, LM Studio, and llama.cpp. Supports streaming, stop/regenerate, RAG, and easy model switching, all without cloud APIs or data leaks.
Updated Feb 8, 2026
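The extension streams replies from a locally running backend rather than a cloud API. As a rough sketch of how that works: Ollama's `/api/chat` endpoint (by default at `http://localhost:11434`) streams newline-delimited JSON chunks, each carrying a fragment of the assistant's message. The minimal chunk shape and accumulator below are an illustrative assumption, not the extension's actual code:

```typescript
// Assumed minimal subset of an Ollama /api/chat streaming chunk:
// one JSON object per line, e.g. {"message":{"content":"Hi"},"done":false}
interface ChatChunk {
  message?: { content: string };
  done: boolean;
}

// Accumulate the assistant's reply from a block of NDJSON lines.
function collectReply(ndjson: string): string {
  let reply = "";
  for (const line of ndjson.split("\n")) {
    if (!line.trim()) continue; // skip blank keep-alive lines
    const chunk = JSON.parse(line) as ChatChunk;
    if (chunk.message) reply += chunk.message.content;
    if (chunk.done) break; // final chunk carries stats, no message text
  }
  return reply;
}
```

In a real client the lines would arrive incrementally from `fetch("http://localhost:11434/api/chat", …)` via a `ReadableStream`; parsing a plain string here keeps the sketch self-contained.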
