lmstudio-ai / lmstudio-bug-tracker
Bug tracking for the LM Studio desktop application
☆74 · Updated 8 months ago
Alternatives and similar repositories for lmstudio-bug-tracker
Users interested in lmstudio-bug-tracker are comparing it to the libraries listed below.
- LM Studio JSON configuration file format and a collection of example config files. ☆205 · Updated last year
- LM Studio localization ☆146 · Updated this week
- Easily access your Ollama models within LMStudio ☆117 · Updated last year
- AI Studio is an independent app for utilizing LLMs. ☆302 · Updated last week
- Link your Ollama models to LM-Studio ☆142 · Updated last year
- pinokio official documentation ☆271 · Updated 3 months ago
- The default client software to create images for the AI-Horde ☆135 · Updated last month
- Dolphin System Messages ☆347 · Updated 7 months ago
- ☆124 · Updated last month
- A LibreOffice Writer extension that adds local-inference generative AI features. ☆136 · Updated 2 weeks ago
- Browser extension for AnythingLLM Docker & Desktop application ☆143 · Updated last year
- Python app for LM Studio-enhanced voice conversations with local LLMs. Uses Whisper for speech-to-text and offers a privacy-focused, acce… ☆112 · Updated last year
- WIP: Open WebUI Chrome Extension (Requires Open WebUI v0.2.0+) ☆154 · Updated last year
- LM Studio Python SDK ☆629 · Updated last week
- This repo turns your PC into an AI Horde worker node ☆275 · Updated 8 months ago
- Plugin that lets you ask questions about your documents including audio and video files. ☆348 · Updated this week
- Add web search results to LLM prompts. ☆33 · Updated 3 months ago
- Ollama chat client in Vue, everything you need to do your private text rpg in browser ☆133 · Updated 10 months ago
- Lightweight, standalone, multi-platform, and privacy focused local LLM chat interface with optional encryption ☆139 · Updated 4 months ago
- A zero dependency web UI for any LLM backend, including KoboldCpp, OpenAI and AI Horde ☆136 · Updated this week
- ☆108 · Updated last year
- LLMX; Easiest 3rd party Local LLM UI for the web! ☆272 · Updated 2 weeks ago
- A Model Context Protocol (MCP) server that allows Claude to communicate with locally running LLM models via LM Studio. ☆112 · Updated 2 months ago
- Tutorial for Pinokio and its Applications ☆91 · Updated last month
- ☆121 · Updated 10 months ago
- update your ollama models to the latest ☆79 · Updated 6 months ago
- ☆17 · Updated 10 months ago
- QA-Pilot is an interactive chat project that leverages online/local LLMs for rapid understanding and navigation of GitHub code repositories. ☆306 · Updated 3 weeks ago
- A simple Gradio WebUI for loading/unloading models and loras in tabbyAPI. ☆20 · Updated 9 months ago
- VSCode AI coding assistant powered by self-hosted llama.cpp endpoint. ☆183 · Updated 7 months ago