yankeexe / ollama-manager
Manage Ollama models from your CLI!
☆16 · Updated 2 months ago
Alternatives and similar repositories for ollama-manager
Users interested in ollama-manager are comparing it to the repositories listed below.
- A collection of LLM inference providers and models · ☆432 · Updated this week
- Gollama: Your offline conversational AI companion. An interactive tool for generating creative responses from various models, right in yo… · ☆152 · Updated 10 months ago
- Access Gemini LLMs from the command line · ☆147 · Updated 4 months ago
- An example extension in Go using function calling and confirmation dialogs · ☆62 · Updated 4 months ago
- A Model Context Protocol (MCP) client library and debugging toolkit in Rust. This foundation provides both a production-ready SDK for bui… · ☆100 · Updated last month
- All-in-one CLI tool for the database | Migration, Studio, LSP · ☆172 · Updated 3 weeks ago
- A "primitive" RAG-like web search Model Context Protocol (MCP) server that runs locally. ✨ no APIs ✨ · ☆89 · Updated last week
- This repository contains a collection of community-maintained Model Context Protocol (MCP) servers. All servers are automatically listed … · ☆64 · Updated 11 months ago
- A simple demonstration of how to keep an LLM loaded in memory for a prolonged time, or unload the model immediately afte… · ☆12 · Updated last year
- Mystical terminal user interface primitives · ☆223 · Updated this week
- Ghost 👻 is an experimental CLI that uses AI to generate GitHub Actions workflows using OpenAI · ☆98 · Updated 2 years ago
- Official Go implementation of the UTCP · ☆78 · Updated this week
- BubbleTea components for Ollama · ☆25 · Updated 11 months ago
- A CLI application for generating simple project boilerplates using generative AI · ☆50 · Updated last year
- An MCP server to wrap ripgrep · ☆41 · Updated 6 months ago
- Interactive CLI tool for selecting and bundling code into a single, LLM-ready output file · ☆83 · Updated 5 months ago
- AI API server for common use cases: supports multiple models and providers. Run locally with Ollama or LM Studio, or in the cloud via Op… · ☆104 · Updated last week
- Station is our open-source runtime that lets teams deploy agents on their own infrastructure with full control. · ☆345 · Updated this week
- Multi-Node LLM Processing Framework · ☆21 · Updated last year
- Write YAML, execute Agent Workflows · ☆289 · Updated 2 weeks ago
- 👩‍💻 MCP server to index external repositories · ☆108 · Updated this week
- Download manager for Ollama · ☆30 · Updated 11 months ago
- ☆64 · Updated 11 months ago
- Use perplexity.ai from the CLI · ☆16 · Updated 9 months ago
- Fabulous, community-maintained bubbles for the https://github.com/charmbracelet/bubbletea library · ☆97 · Updated 6 months ago
- Crawls through a source code repository and generates a single, flattened text file containing all the code files in a clean, structured … · ☆12 · Updated 7 months ago
- A proxy sidecar to access Gemini models via OpenAI and Ollama APIs · ☆157 · Updated 2 months ago
- A web logging proxy for MCP client-server communication · ☆26 · Updated 3 months ago
- A dev container with Ollama and Ollama examples with the Python OpenAI SDK · ☆61 · Updated last year
- ☆19 · Updated 6 months ago