alexziskind1 / llm-inference-calculator
☆148 · Updated 4 months ago
Alternatives and similar repositories for llm-inference-calculator
Users that are interested in llm-inference-calculator are comparing it to the libraries listed below
- Instantly calculate the maximum size of quantized language models that can fit in your available RAM, helping you optimize your models fo… ☆249 · Updated 9 months ago
- A cross-platform app that gives you the best UX to run models locally or remotely on your own hardware ☆72 · Updated last month
- Link your Ollama models to LM-Studio ☆150 · Updated last year
- Explore the unknown, build the future, own your data. ☆234 · Updated this week
- A lightweight UI for chatting with Ollama models. Streaming responses, conversation history, and multi-model support. ☆148 · Updated 10 months ago
- beep boop 🤖 (experimental) ☆118 · Updated last year
- You don’t need to read the code to understand how to build! ☆254 · Updated 2 weeks ago
- A cross-platform desktop application for chatting with locally hosted LLMs, with features like MCP support ☆226 · Updated 5 months ago
- Your gateway to both Ollama & Apple MLX models ☆149 · Updated 10 months ago
- Agent MCP for ffmpeg ☆214 · Updated 7 months ago
- This project was generated 100% by AI, with one prompt. NOTE: This neuroca project was generated in 3 hours on 3/3/2025. There are depend…
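The first entry above centers on a simple sizing calculation: how many quantized weights fit in a given amount of RAM. A minimal sketch of that arithmetic follows; the function name and the runtime `overhead` factor are assumptions for illustration, not taken from any of the listed tools:

```python
def max_params_in_ram(ram_gb: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Estimate the largest parameter count (in billions) that fits in ram_gb.

    Assumes model weights dominate memory use; `overhead` is a guessed
    multiplier for KV cache and runtime buffers, not a value from the
    calculator repo.
    """
    bytes_per_param = bits_per_weight / 8          # e.g. 4-bit -> 0.5 bytes
    usable_bytes = ram_gb * 1024**3 / overhead     # RAM left for weights
    return usable_bytes / bytes_per_param / 1e9    # parameters, in billions

# Example: 16 GB of RAM with 4-bit quantization
print(round(max_params_in_ram(16, 4), 1))
```

Under these assumptions, 16 GB at 4-bit quantization accommodates roughly a 28B-parameter model, which matches the rule of thumb that 4-bit weights cost about half a byte per parameter.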