Gaurav-Gosain / gollama
Gollama: Your offline conversational AI companion. An interactive tool for generating creative responses from various models, right in your terminal. Ideal for brainstorming, creative writing, or seeking inspiration.
☆146 · Updated 6 months ago
Alternatives and similar repositories for gollama
Users interested in gollama are comparing it to the repositories listed below.
- 100% local memory layer and knowledge base for agents with WebUI ☆234 · Updated last month
- Create Linux commands from natural language, in the shell ☆111 · Updated last week
- TUI for Ollama and other LLM providers ☆338 · Updated 2 weeks ago
- Parse files (e.g. code repos) and websites to clipboard or a file for ingestion by AI / LLMs ☆277 · Updated last month
- A modern, POSIX-compatible, generative shell ☆269 · Updated 3 months ago
- Link your Ollama models to LM-Studio ☆141 · Updated last year
- Manage and use multiple Ollama instances with automatic offline detection/failover and model availability tracking ☆80 · Updated 5 months ago
- VSCode AI coding assistant powered by a self-hosted llama.cpp endpoint ☆183 · Updated 5 months ago
- Benchmark your local LLMs ☆49 · Updated 10 months ago
- A simple, lightweight shell script to interact with OpenAI, Ollama, Mistral AI, LocalAI, or ZhipuAI from the terminal, and enhancing… ☆100 · Updated 5 months ago
- Download manager for Ollama ☆29 · Updated 7 months ago
- Mistral API client in Golang ☆90 · Updated last year
- The fastest LLM gateway with built-in OTel observability and MCP gateway ☆210 · Updated this week
- Your gateway to both Ollama & Apple MLX models ☆140 · Updated 4 months ago
- A portable terminal AI interface ☆49 · Updated 3 weeks ago
- Turn natural language into commands. Your CLI tasks, now as easy as a conversation. Run it 100% offline, or use OpenAI's models. ☆59 · Updated last year
- Structured outputs for LLMs ☆162 · Updated 2 weeks ago
- Overide (pronounced over·ide) is a lightweight yet powerful CLI tool that seamlessly integrates AI-powered code generation into your dev… ☆181 · Updated last month
- 🏗️ Fine-tune, build, and deploy open-source LLMs easily! ☆459 · Updated this week
- Nginx proxy server in a Docker container to authenticate and proxy requests to Ollama from the public internet via Cloudflare Tunnel ☆123 · Updated 2 weeks ago
- AI-powered Ollama Modelfile generator ☆26 · Updated last year
- Ollama Shell Helper (osh): English to Unix-like shell command translation using local LLMs with Ollama ☆39 · Updated last year
- Open-source alternative to Perplexity AI with the ability to run locally ☆213 · Updated 9 months ago
- "Primitive" RAG-like web search Model Context Protocol (MCP) server that runs locally. ✨ no APIs ✨ ☆57 · Updated last month
- Web UI for working with large language models ☆34 · Updated last year
- LLM plugin providing access to models running on an Ollama server ☆322 · Updated last week
- ☆18 · Updated 2 months ago
- LLMX; easiest 3rd-party local LLM UI for the web! ☆259 · Updated last week
- A magical LLM desktop client that makes it easy for *anyone* to use LLMs and MCP ☆324 · Updated this week
- Turns devices into a scalable LLM platform ☆150 · Updated this week