ggozad / oterm
the terminal client for Ollama
☆2,270 · Updated last month
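oterm, like most of the tools listed below, talks to a locally running Ollama server over its HTTP API. As a minimal sketch of the kind of request such a terminal client sends, the snippet below calls Ollama's documented /api/chat endpoint; the host/port (localhost:11434 is Ollama's default) are standard, while the model name "llama3" is an assumption for illustration.

```python
import json
import urllib.request

# Chat request against a local Ollama server (default port 11434).
# The model name "llama3" is an assumption -- use any model you have pulled.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# The assistant's reply is returned under message.content.
print(body["message"]["content"])
```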
Alternatives and similar repositories for oterm
Users who are interested in oterm are comparing it to the repositories listed below.
- Proxy that allows you to use Ollama as a copilot, like GitHub Copilot ☆796 · Updated 2 months ago
- Go manage your Ollama models ☆1,593 · Updated 3 weeks ago
- A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3,… ☆2,376 · Updated last year
- A multi-platform desktop application to evaluate and compare LLM models, written in Rust and React. ☆878 · Updated last week
- Replace Copilot with local AI ☆2,071 · Updated last year
- Simple HTML UI for Ollama ☆1,099 · Updated 3 months ago
- Bridge between Ollama and MCP servers, enabling local LLMs to use Model Context Protocol tools ☆945 · Updated 7 months ago
- Effortlessly run LLM backends, APIs, frontends, and services with one command. ☆2,161 · Updated this week
- Reliable model swapping for any local OpenAI-compatible server - llama.cpp, vLLM, etc. ☆1,933 · Updated last week
- VS Code extension for LLM-assisted code/text completion ☆1,072 · Updated 2 weeks ago
- LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software e… ☆3,050 · Updated 10 months ago
- Local AI API Platform ☆2,764 · Updated 4 months ago
- Local CLI Copilot, powered by Ollama. 💻🦙 ☆1,461 · Updated last month
- Easily create LLM tools and agents using plain Bash/JavaScript/Python functions. ☆678 · Updated 5 months ago
- A CLI host application that enables Large Language Models (LLMs) to interact with external tools through the Model Context Protocol (MCP)… ☆1,477 · Updated 2 weeks ago
- LM Studio CLI ☆3,934 · Updated this week
- A web interface for chatting with your local LLMs via the Ollama API ☆1,158 · Updated 3 months ago
- A minimal LLM chat app that runs entirely in your browser ☆1,036 · Updated last month
- Pipelines: Versatile, UI-Agnostic, OpenAI-Compatible Plugin Framework ☆2,186 · Updated 3 months ago
- RamaLama is an open-source developer tool that simplifies the local serving of AI models from any source and facilitates their use for in… ☆2,343 · Updated last week
- Chat with your documents using local AI ☆1,074 · Updated last year
- The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but 100% free. ☆3,623 · Updated 3 months ago
- AI-powered, non-intrusive terminal assistant ☆1,327 · Updated this week
- Bionic is an on-premise replacement for ChatGPT, offering the advantages of generative AI while maintaining strict data confidentiality ☆2,274 · Updated last week
- Distributed LLM inference. Connect home devices into a powerful cluster to accelerate LLM inference. More devices mean faster inference. ☆2,755 · Updated last month
- OpenAPI Tool Servers ☆764 · Updated 2 months ago
- LLM plugin providing access to models running on an Ollama server ☆343 · Updated last month
- LM Studio TypeScript SDK ☆1,421 · Updated this week
- A proxy server for multiple Ollama instances with key security ☆540 · Updated 2 weeks ago
- Mac-compatible Ollama Voice ☆511 · Updated 3 months ago