ggozad / oterm
a text-based terminal client for Ollama
☆1,428 · Updated this week
Alternatives and similar repositories for oterm:
Users interested in oterm are comparing it to the libraries listed below:
- A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3,… ☆2,071 · Updated 5 months ago
- Proxy that lets you use Ollama as a Copilot-style assistant, similar to GitHub Copilot ☆570 · Updated last month
- Simple HTML UI for Ollama ☆995 · Updated last month
- Terminal UI to chat with large language models (LLM) using different model backends, and integrations with your favourite editors! ☆593 · Updated 9 months ago
- Replace Copilot with local AI ☆1,978 · Updated 10 months ago
- Go manage your Ollama models ☆960 · Updated 2 weeks ago
- A web interface for chatting with your local LLMs via the Ollama API ☆945 · Updated 3 weeks ago
- BionicGPT is an on-premise replacement for ChatGPT, offering the advantages of Generative AI while maintaining strict data confidentiality ☆2,122 · Updated last month
- A multi-platform desktop application to evaluate and compare LLM models, written in Rust and React. ☆672 · Updated 2 months ago
- Pipelines: Versatile, UI-Agnostic OpenAI-Compatible Plugin Framework ☆1,575 · Updated this week
- Text-To-Speech, RAG, and LLMs. All local! ☆1,762 · Updated 3 months ago
- Stateful load balancer custom-tailored for llama.cpp 🏓🦙 ☆728 · Updated this week
- Chatbot Ollama is an open source chat UI for Ollama. ☆1,666 · Updated this week
- VS Code extension for LLM-assisted code/text completion ☆608 · Updated this week
- Local voice chatbot for engaging conversations, powered by Ollama, Hugging Face Transformers, and Coqui TTS Toolkit ☆755 · Updated 7 months ago
- WIP: Open WebUI desktop application, based on Electron. ☆301 · Updated 2 months ago
- Vim plugin for LLM-assisted code/text completion ☆1,290 · Updated last week
- LLM plugin providing access to models running on an Ollama server ☆256 · Updated 2 weeks ago
- Chat with your documents using local AI ☆1,016 · Updated 8 months ago
- A fast Rust-based tool to serialize text-based files in a repository or directory for LLM consumption ☆1,809 · Updated this week
- An application for running LLMs locally on your device, with your documents, facilitating detailed citations in generated responses. ☆566 · Updated 4 months ago
- Apple MLX engine for LM Studio ☆466 · Updated this week
- Effortlessly run LLM backends, APIs, frontends, and services with one command. ☆1,483 · Updated this week
- Command line utility to make you a magician in the terminal ☆745 · Updated 7 months ago
- Local AI API Platform ☆2,563 · Updated this week
- Enchanted is an iOS and macOS app for chatting with private, self-hosted language models such as Llama 2, Mistral, or Vicuna using Ollama. ☆5,052 · Updated this week
- A proxy server for multiple Ollama instances with key security ☆365 · Updated last month
- Blazingly fast LLM inference. ☆5,240 · Updated this week
- Local CLI Copilot, powered by Ollama. 💻🦙 ☆1,391 · Updated last month
- Easily create LLM tools and agents using plain Bash/JavaScript/Python functions. ☆447 · Updated last month