irreg / native_tool_call_adapter
Helps agents work more efficiently by translating Cline/Roo-Code tool calls into native API tool calls
☆55 · Updated 2 months ago
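As a rough, hypothetical sketch of the kind of translation such an adapter performs: a Cline/Roo-Code-style XML tool call block is lifted into the native `tool_calls` structure used by the OpenAI Chat Completions API. The function name, tag handling, and argument mapping below are illustrative assumptions, not the adapter's actual code.

```python
import json
import uuid
import xml.etree.ElementTree as ET


def xml_tool_call_to_native(xml_block: str) -> dict:
    """Turn one XML-style tool call block into a native tool_call entry.

    Illustrative assumption: the outer tag is the tool name and each child
    tag/value pair becomes a JSON argument, e.g.
    <read_file><path>src/main.py</path></read_file>.
    """
    root = ET.fromstring(xml_block)
    arguments = {child.tag: (child.text or "").strip() for child in root}
    return {
        "id": f"call_{uuid.uuid4().hex[:12]}",   # synthetic call id for the sketch
        "type": "function",
        "function": {
            "name": root.tag,
            "arguments": json.dumps(arguments),  # native calls carry JSON-encoded args
        },
    }


if __name__ == "__main__":
    block = "<read_file><path>src/main.py</path></read_file>"
    print(json.dumps(xml_tool_call_to_native(block), indent=2))
```

A real adapter would also need the reverse mapping (native tool results back into the text format the agent expects), which is omitted in this sketch.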
Alternatives and similar repositories for native_tool_call_adapter
Users interested in native_tool_call_adapter are comparing it to the repositories listed below
- Wraps any OpenAI API interface as Responses with MCPs support so it supports Codex. Adding any missing stateful features. Ollama and Vllm… ☆138 · Updated last month
- ☆176 · Updated 4 months ago
- ☆196 · Updated 3 months ago
- ☆87 · Updated 3 weeks ago
- A simple, locally hosted Web Search MCP server for use with Local LLMs ☆386 · Updated 4 months ago
- Autonomous, agentic, creative story writing system that incorporates stored embeddings and Knowledge Graphs. ☆82 · Updated last week
- ☆228 · Updated 7 months ago
- Qwen Code OpenAI Wrapper with Cloudflare Workers ☆51 · Updated 3 weeks ago
- ☆70 · Updated 4 months ago
- A local AI companion that uses a collection of free, open source AI models in order to create two virtual companions that will follow you… ☆237 · Updated last month
- Run multiple resource-heavy Large Models (LM) on the same machine with limited amount of VRAM/other resources by exposing them on differe… ☆85 · Updated last week
- ☆42 · Updated last year
- General Tool-calling API Proxy ☆54 · Updated 4 months ago
- ☆83 · Updated 9 months ago
- llama-swap + a minimal ollama compatible api ☆37 · Updated last week
- User-friendly AI Interface (Supports Ollama, OpenAI API, ...) ☆101 · Updated 8 months ago
- Llama.cpp runner/swapper and proxy that emulates LMStudio / Ollama backends ☆49 · Updated 3 months ago
- A lightweight UI for chatting with Ollama models. Streaming responses, conversation history, and multi-model support. ☆142 · Updated 9 months ago
- Proxy for OpenAI ☆15 · Updated 3 months ago
- A persistent local memory for AI, LLMs, or Copilot in VS Code. ☆178 · Updated last month
- Fresh builds of llama.cpp with AMD ROCm™ 7 acceleration ☆139 · Updated this week
- A frontend for creative writing with LLMs ☆140 · Updated last year
- the AI IDE for work, research, development, and play. ☆213 · Updated this week
- ☆50 · Updated 2 months ago
- Copilot Proxy is a Visual Studio Code extension that exposes the VS Code Language Model API via an Express server. This experimental exte… ☆99 · Updated 9 months ago
- MCP server paired with a browser extension that enables AI agents to control the user's browser. ☆212 · Updated 3 months ago
- Inference engine for Intel devices. Serve LLMs, VLMs, Whisper, Kokoro-TTS, Embedding and Rerank models over OpenAI endpoints. ☆260 · Updated 2 weeks ago
- This is a cross-platform desktop application that allows you to chat with locally hosted LLMs and enjoy features like MCP support ☆226 · Updated 4 months ago
- Enhancing LLMs with LoRA ☆193 · Updated last month
- *NIX SHELL with Local AI/LLM integration ☆24 · Updated 9 months ago