SearchSavior / OpenArc
Lightweight Inference server for OpenVINO
☆176 · Updated last week
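OpenArc is typically run as a local inference endpoint for OpenVINO-backed models. Below is a minimal sketch of how a client might call such a server, assuming it exposes an OpenAI-style `/v1/chat/completions` route; the port, route, and model identifier are illustrative placeholders, not values confirmed by this listing.

```python
# Minimal sketch: query a local OpenAI-compatible inference server.
# Assumptions (not confirmed by this listing): the server exposes an
# OpenAI-style /v1/chat/completions endpoint; the port (8000) and the
# model identifier are placeholders.
import requests

response = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "your-openvino-model",  # placeholder model id
        "messages": [
            {"role": "user", "content": "Hello!"},
        ],
    },
    timeout=60,
)
response.raise_for_status()
# Standard OpenAI-style response shape: first choice, assistant message text.
print(response.json()["choices"][0]["message"]["content"])
```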
Alternatives and similar repositories for OpenArc
Users interested in OpenArc are comparing it to the libraries listed below.
- InferX is an Inference Function-as-a-Service platform ☆105 · Updated this week
- ☆71 · Updated last week
- Turns devices into a scalable LLM platform ☆134 · Updated last week
- Easy to use interface for the Whisper model optimized for all GPUs! ☆212 · Updated last week
- ☆76 · Updated 3 months ago
- llama.cpp fork with additional SOTA quants and improved performance ☆519 · Updated this week
- MAESTRO is an AI-powered research application designed to streamline complex research tasks. ☆154 · Updated 2 weeks ago
- Collection of LLM system prompts. ☆33 · Updated last week
- ☆97 · Updated 3 weeks ago
- Cohere Toolkit is a collection of prebuilt components enabling users to quickly build and deploy RAG applications. ☆28 · Updated 4 months ago
- ☆120 · Updated last week
- klmbr - a prompt pre-processing technique to break through the barrier of entropy while generating text with LLMs ☆72 · Updated 8 months ago
- Dia-JAX: A JAX port of Dia, the text-to-speech model for generating realistic dialogue from text with emotion and tone control. ☆27 · Updated 3 weeks ago
- ☆90 · Updated 5 months ago
- GPU Power and Performance Manager ☆58 · Updated 7 months ago
- Open source LLM UI, compatible with all local LLM providers. ☆174 · Updated 8 months ago
- Run multiple resource-heavy Large Models (LM) on the same machine with limited amount of VRAM/other resources by exposing them on differe… ☆62 · Updated this week
- An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs ☆375 · Updated last week
- No-code CLI designed for accelerating ONNX workflows ☆192 · Updated 2 weeks ago
- ☆202 · Updated 2 weeks ago
- Comparison of the output quality of quantization methods, using Llama 3, transformers, GGUF, EXL2. ☆153 · Updated last year
- LLM Inference on consumer devices ☆115 · Updated 2 months ago
- A sleek web interface for Ollama, making local LLM management and usage simple. WebOllama provides an intuitive UI to manage Ollama model… ☆48 · Updated last week
- Privacy-first agentic framework with powerful reasoning & task automation capabilities. Natively distributed and fully ISO 27XXX complian… ☆65 · Updated 2 months ago
- ☆197 · Updated 3 weeks ago
- Local LLM Powered Recursive Search & Smart Knowledge Explorer ☆242 · Updated 3 months ago
- 🔥 LitLytics - an affordable, simple analytics platform that leverages LLMs to automate data analysis ☆99 · Updated 6 months ago
- Model swapping for llama.cpp (or any local OpenAI-compatible server) ☆848 · Updated this week
- Minimal Linux OS with a Model Context Protocol (MCP) gateway to expose local capabilities to LLMs. ☆228 · Updated last week
- Automatically quantize GGUF models ☆179 · Updated this week