Use Codestral Mamba with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.
☆29 · Jul 18, 2024 · Updated last year
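As a sketch of how the wiring might look: Continue reads a `config.json` (typically under `~/.continue/`) where you register models by provider. The snippet below is a minimal example assuming the model is served locally through Ollama under the name `codestral-mamba`; the exact model tag and config schema can vary between Continue versions, so treat the field names as illustrative.

```json
{
  "models": [
    {
      "title": "Codestral Mamba",
      "provider": "ollama",
      "model": "codestral-mamba"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Codestral Mamba",
    "provider": "ollama",
    "model": "codestral-mamba"
  }
}
```

With an entry like this, chat requests and tab autocompletion both go to the local model instead of a hosted Copilot backend.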
Alternatives and similar repositories for codestral-mamba-for-vscode
Users interested in codestral-mamba-for-vscode are comparing it to the repositories listed below.
- A simple no-install web UI for Ollama and OAI-Compatible APIs! ☆31 · Jan 30, 2025 · Updated last year
- ☆10 · Apr 4, 2023 · Updated 3 years ago
- A repository of helpful information and emerging insights regarding LLMs ☆21 · Oct 27, 2023 · Updated 2 years ago
- entropix-style sampling + GUI ☆27 · Oct 30, 2024 · Updated last year
- A simple frontend page to interact with an OpenAI-like API ☆16 · Jan 31, 2025 · Updated last year
- 33B Chinese LLM, DPO QLoRA, 100K context, AirLLM 70B inference with a single 4GB GPU ☆13 · May 5, 2024 · Updated last year
- A character.ai-like UI for LLMs ☆10 · Dec 3, 2024 · Updated last year
- A simple Gradio UI to run Qwen2 VL 72B AWQ in a venv, with both image and video inferencing working ☆33 · Oct 3, 2024 · Updated last year
- Copy a bunch of files into your clipboard to provide context for LLMs ☆113 · Feb 8, 2026 · Updated 2 months ago
- llmon-py is a multimodal web UI for Llama 3-8B ☆16 · Jul 1, 2024 · Updated last year
- Self-contained, zero-dependency Python lib that gives you unified device properties for GPU, CPU, and NPU. No more calling separat… ☆14 · Mar 30, 2026 · Updated 2 weeks ago
- A bot that checks your grammar and phrasing using an LLM of your choice ☆32 · Feb 6, 2025 · Updated last year
- Working folder where I experiment with using ktranslate to populate Grafana dashboards ☆13 · Jan 20, 2026 · Updated 2 months ago
- Prettygood DSP: a tiny Arduino-based audio DSP board ☆12 · Apr 18, 2022 · Updated 3 years ago
- ☆14 · May 20, 2022 · Updated 3 years ago
- ☆17 · Dec 16, 2024 · Updated last year
- ☆18 · Oct 3, 2024 · Updated last year
- ☆24 · Jan 22, 2025 · Updated last year
- The LLM-powered function builder for TypeScript ☆23 · Aug 30, 2024 · Updated last year
- Next.js RAG with PGVector ☆32 · Nov 30, 2024 · Updated last year
- AutoREADME is an AI-powered tool that generates a README file for any given input repository ☆33 · Oct 2, 2024 · Updated last year
- Tcurtsni: reverse-instruction chat; ever wonder what your LLM wants to ask you? ☆23 · Jun 25, 2024 · Updated last year
- My version of an LLM web-search agent, using a local SearXNG server because SearXNG is great ☆45 · Jan 27, 2026 · Updated 2 months ago
- Open-source LLM UI, compatible with all local LLM providers ☆177 · Sep 20, 2024 · Updated last year
- Write markdown documentation and generate Mermaid diagrams with ease; made with Hanko, SolidJS, and Supabase ☆16 · Nov 4, 2023 · Updated 2 years ago
- Blue-text Bot AI; uses Ollama + AppleScript ☆50 · May 19, 2024 · Updated last year
- ☆54 · May 28, 2025 · Updated 10 months ago
- Experience the power of AI with this free AI voice generator demo. Utilizing Deepgram and Groq, we transform text into voice seamlessly… ☆37 · Jun 12, 2024 · Updated last year
- oobabooga extension: an experimental sampler to make LLMs more creative ☆23 · Aug 2, 2023 · Updated 2 years ago
- DDEV add-on to install the PHP-SPX performance package ☆13 · Oct 8, 2024 · Updated last year
- ☆167 · Jun 22, 2025 · Updated 9 months ago
- The heart of The Pulsar App: fast, secure, shared inference with a modern UI ☆59 · Dec 1, 2024 · Updated last year
- Simple Tool Caller for llama.cpp ☆11 · Aug 12, 2024 · Updated last year
- ☆94 · Mar 28, 2026 · Updated 2 weeks ago
- USB To Amiga Mouse V2 ☆12 · May 22, 2019 · Updated 6 years ago
- ☆12 · Apr 6, 2026 · Updated last week
- Serving LLMs in the HF Transformers format via a PyFlask API ☆72 · Sep 10, 2024 · Updated last year
- Python package wrapping llama.cpp for on-device LLM inference ☆103 · Apr 2, 2026 · Updated 2 weeks ago
- A shell script game where you kill random processes on your computer; the more you kill, the higher your score! ☆17 · Sep 27, 2023 · Updated 2 years ago