alexziskind1 / llm-inference-calculator
☆118 Updated 3 weeks ago
Alternatives and similar repositories for llm-inference-calculator
Users interested in llm-inference-calculator are comparing it to the libraries listed below.
- Instantly calculate the maximum size of quantized language models that can fit in your available RAM, helping you optimize your models fo… ☆228 Updated 2 months ago
- Link your Ollama models to LM-Studio ☆140 Updated last year
- Your gateway to both Ollama & Apple MLX models ☆140 Updated 4 months ago
- This is a cross-platform desktop application that allows you to chat with locally hosted LLMs and enjoy features like MCP support ☆221 Updated last month
- Notate is a desktop chat application that takes AI conversations to the next level. It combines the simplicity of chat with advanced feat… ☆257 Updated 4 months ago
- You don’t need to read the code to understand how to build! ☆201 Updated 6 months ago
- Agent MCP for ffmpeg ☆196 Updated last month
- Welcome! ☆140 Updated 7 months ago
- A cross-platform app that unlocks your device's Gen AI capabilities ☆60 Updated this week
- A Python-based web-assisted large language model (LLM) search assistant using Llama.cpp ☆357 Updated 8 months ago
- A lightweight UI for chatting with Ollama models. Streaming responses, conversation history, and multi-model support. ☆110 Updated 4 months ago
- LLM search engine faster than perplexity! ☆277 Updated this week
- LM Studio Python SDK ☆551 Updated this week
- 🤖 An open-source AI assistant answering questions using your docs ☆175 Updated 2 months ago
- A multi-agent AI architecture that connects 25+ specialized agents through n8n and MCP servers. Project NOVA routes requests to domain-sp… ☆195 Updated last month
- AI agents platform that gives you an integrated team of personal assistants that can work behind the scenes to handle daily monotonous ta… ☆116 Updated this week
- Easily access your Ollama models within LMStudio ☆113 Updated last year
- AI Studio is an independent app for utilizing LLMs. ☆285 Updated this week
- Local debugging agent that runs in your terminal ☆382 Updated last month
- The Fastest Way to Fine-Tune LLMs Locally ☆312 Updated 3 months ago
- Open-source assistant using small models (2B-5B), with agentic and tool-calling capabilities and integration of RAG with efficient … ☆202 Updated last month
- MCP manager: sync config across clients, one-click setup, add & install Model Context Protocol (MCP) servers to Claude/Cursor/Windsurf/… ☆144 Updated this week
- Apple MLX engine for LM Studio ☆669 Updated this week
- A unified search engine for all your online knowledge → The Invisible Companion for Work + Life ☆130 Updated last month
- Minimal Linux OS with a Model Context Protocol (MCP) gateway to expose local capabilities to LLMs. ☆257 Updated 3 weeks ago
- Optimized Ollama LLM server configuration for Mac Studio and other Apple Silicon Macs. Headless setup with automatic startup, resource op… ☆205 Updated 4 months ago
- Local LLM Powered Recursive Search & Smart Knowledge Explorer ☆244 Updated 5 months ago
- Ollama client written in Python ☆4 Updated 7 months ago
- ☆141 Updated last week
- ☆186 Updated 3 months ago