10Nates / ollama-autocoder
A simple-to-use Ollama autocompletion engine with exposed options and streaming functionality
☆111 · Updated 3 months ago
Alternatives and similar repositories for ollama-autocoder:
Users interested in ollama-autocoder are comparing it to the libraries listed below.
- beep boop 🤖 ☆72 · Updated 3 weeks ago
- Dagger functions to import Hugging Face GGUF models into a local Ollama instance and optionally push them to ollama.com. ☆114 · Updated 8 months ago
- Midori AI's Mono Repo! Check out our site below! ☆115 · Updated this week
- A GUI for Ollama ☆313 · Updated 3 months ago
- Lightweight, standalone, multi-platform, and privacy-focused local LLM chat interface with optional encryption ☆88 · Updated 2 months ago
- An open-source VSCode extension: an AI coding assistant that integrates with Ollama, HuggingFace, OpenAI, and Anthropic. ☆174 · Updated this week
- Examples of integrating the OpenRouter API ☆111 · Updated last year
- AI Studio is an independent app for working with LLMs. ☆200 · Updated last week
- AutoGen Studio in Docker ☆76 · Updated 3 weeks ago
- 100% Local AGI with LocalAI ☆440 · Updated 7 months ago
- Rivet plugin for integration with Ollama, the tool for easily running LLMs locally ☆35 · Updated 9 months ago
- A repository of Open WebUI tools to use with your favourite LLMs ☆103 · Updated 2 weeks ago
- WIP: Open WebUI Chrome extension (requires Open WebUI v0.2.0+) ☆89 · Updated 8 months ago
- A function-calling proxy for Groq, the fastest AI alive! ☆183 · Updated 10 months ago
- Corrective RAG demo powered by Ollama ☆84 · Updated 9 months ago
- WIP: Open WebUI desktop application based on Electron. ☆175 · Updated last week
- LLM benchmark for throughput via Ollama (local LLMs) ☆162 · Updated last week
- Create Linux commands from natural language, in the shell. ☆105 · Updated 3 months ago
- Ollama chat client in Vue, with everything you need to run your private text RPG in the browser ☆110 · Updated 3 months ago
- Efficient visual programming for AI language models ☆339 · Updated 4 months ago
- A proxy server for multiple Ollama instances with key-based security ☆318 · Updated 3 weeks ago
- A front-end for self-hosted LLMs based on the LocalAI API ☆68 · Updated 9 months ago
- Parse files (e.g. code repos) and websites to the clipboard or a file for ingestion by AI / LLMs ☆134 · Updated last month
- LLMX: the easiest third-party local LLM UI for the web! ☆202 · Updated last month
- Easily access your Ollama models within LM Studio ☆71 · Updated 9 months ago
- VSCode extension for Ollama ☆21 · Updated last year
- API up your Ollama server. ☆123 · Updated last month
- MinimalChat is a lightweight, open-source chat application that lets you interact with various large language models. ☆154 · Updated 6 months ago
- Download models from the Ollama library, without Ollama ☆49 · Updated 2 months ago
- An example of running local models with GGML ☆39 · Updated last year