Run GGUF models easily with a KoboldAI UI. One File. Zero Install.
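As a sketch of what "One File. Zero Install." means in practice: the commands below assume a prebuilt `koboldcpp` binary and a local GGUF model file. The flag names (`--model`, `--port`, `--gpulayers`) follow koboldcpp's documented CLI, but check `--help` for the exact options in your release; the model filename here is a placeholder.

```shell
# koboldcpp ships as a single self-contained executable -- no installer needed.
# Point it at any GGUF model; it serves the KoboldAI UI over HTTP on a local port.
./koboldcpp --model ./my-model.gguf --port 5001 --gpulayers 20
# Then open http://localhost:5001 in a browser to use the KoboldAI UI.
```

The `--gpulayers` flag offloads part of the model to the GPU; omit it to run fully on CPU.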
☆ 10,323 · Apr 26, 2026 · Updated this week
Alternatives and similar repositories for koboldcpp
Users that are interested in koboldcpp are comparing it to the libraries listed below.
- LLM Frontend for Power Users. ☆ 26,387 · Updated this week
- The original local LLM interface. Text, vision, tool-calling, training. UI + API, 100% offline and private. ☆ 46,874 · Updated this week
- For GGUF support, see KoboldCPP: https://github.com/LostRuins/koboldcpp ☆ 3,880 · Jan 16, 2025 · Updated last year
- LLM inference in C/C++ ☆ 106,639 · Updated this week
- AI Inferencing at the Edge. A simple one-file way to run various GGML models with KoboldAI's UI, with AMD ROCm offloading. ☆ 767 · Dec 30, 2025 · Updated 3 months ago
- A fast inference library for running LLMs locally on modern consumer-class GPUs