twinnydotdev / twinny
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but 100% free.
☆3,619 · Updated 4 months ago
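Like most of the local-first tools listed below, twinny is typically pointed at a locally hosted model server such as Ollama. As a rough illustration only (not twinny's actual internals; the model name and prompt are placeholders, and an Ollama server is assumed to be running on its default port 11434), a code-completion request against such a backend looks roughly like this:

```typescript
// Minimal sketch: ask a local Ollama server (default port 11434) to complete a code snippet.
// Assumptions: Ollama is running locally and a code model (here "codellama", as an example) is pulled.
async function completeCode(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama", // placeholder model name; any locally pulled code model works
      prompt,             // the code context to continue
      stream: false,      // return a single JSON object instead of a token stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;   // the generated completion text
}

// Example usage: complete a small TypeScript function body.
completeCode("function add(a: number, b: number): number {\n  return ")
  .then((completion) => console.log(completion))
  .catch(console.error);
```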
Alternatives and similar repositories for twinny
Users interested in twinny are comparing it to the libraries listed below.
- Replace Copilot local AI ☆2,073 · Updated last year
- Type less, code more: Cody is an AI code assistant that uses advanced search and codebase context to help you write and fix code. ☆3,792 · Updated 4 months ago
- AI Agent that handles engineering tasks end-to-end: integrates with developers’ tools, plans, executes, and iterates until it achieves a … ☆3,405 · Updated this week
- Local AI API Platform ☆2,763 · Updated 5 months ago
- An open-source alternative to GitHub Copilot that runs locally. ☆987 · Updated last year
- Are Copilots Local Yet? The frontier of local LLM Copilots for code completion, project generation, shell assistance, and more. Find tool… ☆579 · Updated 10 months ago
- LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a ch… ☆5,956 · Updated 7 months ago
- Fully private LLM chatbot that runs entirely in the browser with no server needed. Supports Mistral and Llama 3. ☆2,675 · Updated last year
- The terminal client for Ollama ☆2,283 · Updated last month
- Simple HTML UI for Ollama ☆1,100 · Updated 3 months ago
- Local CLI Copilot, powered by Ollama. 💻🦙 ☆1,460 · Updated last month
- LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software e… ☆3,060 · Updated 11 months ago
- Proxy that allows you to use Ollama as a copilot, like GitHub Copilot ☆797 · Updated 3 months ago
- Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot. ☆570 · Updated last year
- Bionic is an on-premise replacement for ChatGPT, offering the advantages of Generative AI while maintaining strict data confidentiality ☆2,275 · Updated this week
- LLM-powered development for VSCode ☆1,313 · Updated last year
- 👀 What LLM to use? ☆645 · Updated last year
- Reliable model swapping for any local OpenAI/Anthropic-compatible server - llama.cpp, vllm, etc. ☆1,977 · Updated last week
- Effortlessly run LLM backends, APIs, frontends, and services with one command. ☆2,172 · Updated this week
- Chat with your documents using local AI ☆1,074 · Updated last year
- ☆3,379 · Updated last year
- ✨ Fully autonomous AI Agent that can perform complicated tasks and projects using terminal, browser, and editor. ☆2,403 · Updated last year
- Your agent in your terminal, equipped with local tools: writes code, uses the terminal, browses the web, vision. ☆4,074 · Updated this week
- Home of StarCoder2! ☆1,998 · Updated last year
- Distributed LLM inference. Connect home devices into a powerful cluster to accelerate LLM inference. More devices mean faster inference. ☆2,755 · Updated last month
- Private & local AI personal knowledge management app for high entropy people. ☆8,407 · Updated 6 months ago
- ⏩ Ship faster with Continuous AI. Open-source CLI that can be used in TUI mode as a coding agent or Headless mode to run background agent… ☆30,177 · Updated this week
- 🔍 AI search engine - self-host with local or cloud LLMs ☆3,487 · Updated last year
- A fast inference library for running LLMs locally on modern consumer-class GPUs ☆4,379 · Updated 3 months ago
- Pipelines: Versatile, UI-Agnostic OpenAI-Compatible Plugin Framework ☆2,204 · Updated 3 months ago