RayFernando1337 / LLM-Calc
Instantly calculate the maximum size of quantized language models that can fit in your available RAM, helping you optimize your models for inference.
☆243 · Updated 8 months ago
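The underlying estimate is simple arithmetic: a model's weight footprint is roughly its parameter count times the bits per weight, divided by 8 to get bytes, so the largest model that fits is your usable RAM divided by the bytes per weight. Below is a minimal Python sketch of that rule of thumb; the overhead figure is an illustrative assumption, not LLM-Calc's exact formula.

```python
def max_params_billions(ram_gb: float, bits_per_weight: float,
                        os_overhead_gb: float = 2.0) -> float:
    """Rough upper bound on model size (in billions of parameters)
    that fits in `ram_gb` of memory at a given quantization level.

    Assumptions (illustrative, not LLM-Calc's exact formula):
    - os_overhead_gb is reserved for the OS and inference runtime.
    - KV cache and activation memory are ignored.
    """
    usable_bytes = (ram_gb - os_overhead_gb) * 1e9
    bytes_per_weight = bits_per_weight / 8
    return usable_bytes / bytes_per_weight / 1e9

# Example: 16 GB of RAM with 4-bit quantization -> roughly 28B parameters
print(f"{max_params_billions(16, 4):.1f}B")
```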
Alternatives and similar repositories for LLM-Calc
Users interested in LLM-Calc are comparing it to the libraries listed below.
- Local Groq Desktop chat app with MCP support ☆381 · Updated this week
- ☆191 · Updated last year
- The Open Deep Research app – generate reports with OSS LLMs ☆314 · Updated 3 weeks ago
- Hallucination Detector is a free and open-source tool that helps you instantly verify the accuracy of your LLM-generated content. ☆305 · Updated last month
- Provider-agnostic, open-source evaluation infrastructure for language models ☆705 · Updated 2 weeks ago
- Examples of how to use various LLM providers on a wine classification problem ☆131 · Updated 2 months ago
- A powerful Python tool for performing technical searches using the Perplexity API, optimized for retrieving precise facts, code examples,… ☆210 · Updated 11 months ago
- FastMLX is a high-performance, production-ready API for hosting MLX models. ☆339 · Updated 9 months ago
- ☆136 · Updated 11 months ago
- ☆78 · Updated last year
- Pipecat voice AI agents running locally on macOS ☆300 · Updated 4 months ago
- A simple Python program to implement the search-extract-summarize flow. ☆275 · Updated 6 months ago