Gaurav-Gosain / gollama
Gollama: Your offline conversational AI companion. An interactive tool for generating creative responses from various models, right in your terminal. Ideal for brainstorming, creative writing, or seeking inspiration.
☆167 · Updated last year
Alternatives and similar repositories for gollama
Users interested in gollama are comparing it to the libraries listed below.
- A battery-included, POSIX-compatible, generative shell ☆370 · Updated 2 weeks ago
- A portable terminal AI interface ☆141 · Updated this week
- TUI for Ollama and other LLM providers ☆412 · Updated this week
- Create Linux commands from natural language, in the shell. ☆122 · Updated 5 months ago
- 🐈 A collection of LLM inference providers and models ☆551 · Updated this week
- Parse files (e.g. code repos) and websites to clipboard or a file for ingestion by AI / LLMs ☆361 · Updated last week
- Manage and use multiple Ollama instances with automatic offline detection/failover and model availability tracking ☆94 · Updated last year
- ☆19 · Updated 9 months ago
- Link your Ollama models to LM-Studio ☆150 · Updated last year
- Generative UI in your terminal ☆405 · Updated last year
- Turn natural language into commands. Your CLI tasks, now as easy as a conversation. Run it 100% offline, or use OpenAI's models. ☆63 · Updated last year
- A simple, lightweight shell script to interact with OpenAI, Ollama, Mistral AI, LocalAI, or ZhipuAI from the terminal, and enhancing… ☆116 · Updated last year
- Benchmark your local LLMs. ☆53 · Updated last year
- Overide (pronounced over·ide) is a lightweight, yet powerful CLI tool that seamlessly integrates AI-powered code generation into your dev… ☆192 · Updated 6 months ago
- Mystical terminal user interface primitives 🌈 ☆253 · Updated last week
- Structured outputs for LLMs ☆189 · Updated 3 months ago
- A CLI for piping outputs to ollama or just prompting ☆57 · Updated last year
- Access Gemini LLMs from the command line ☆148 · Updated 7 months ago
- Mistral API Client in Golang ☆97 · Updated last year
- VSCode AI coding assistant powered by self-hosted llama.cpp endpoint. ☆183 · Updated last year
- A simple Web / UI / App / Frontend to Ollama. ☆84 · Updated 9 months ago
- AI-powered Ollama Modelfile Generator ☆25 · Updated last year
- 100% Local Memory layer and Knowledge base for agents with WebUI ☆705 · Updated this week
- smol-dev-go, a Go implementation of smol developer ☆71 · Updated last year
- A lightweight UI for chatting with Ollama models. Streaming responses, conversation history, and multi-model support. ☆148 · Updated 10 months ago
- LLM plugin providing access to models running on an Ollama server ☆351 · Updated last month
- Ollama Cloud is a Highly Scalable Cloud-native Stack for Ollama ☆154 · Updated last year
- ✋ Interactive CLI tool for selecting and bundling code into a single, LLM-ready output file ☆89 · Updated last month
- Download models from the Ollama library, without Ollama ☆121 · Updated last year
- ☆100 · Updated 3 months ago