mattcurf / ollama-intel-gpu
☆257 · Updated 6 months ago
Alternatives and similar repositories for ollama-intel-gpu
Users interested in ollama-intel-gpu are comparing it to the repositories listed below:
- Make use of Intel Arc Series GPU to run Ollama, StableDiffusion, Whisper and Open WebUI, for image generation, speech recognition and int… (☆217 · Updated 3 weeks ago)
- Inference engine for Intel devices. Serve LLMs, VLMs, Whisper, Kokoro-TTS, Embedding and Rerank models over OpenAI endpoints. (☆267 · Updated this week)
- AMD APU-compatible Ollama. Get up and running with OpenAI gpt-oss, DeepSeek-R1, Gemma 3 and other models. (☆138 · Updated last week)
- A step-by-step guide to enabling Gen 12/13 Intel vGPU using SR-IOV technology so up to 7 client VMs can enjoy hardware GPU decoding (☆308 · Updated last year)
- Run LLMs on AMD Ryzen™ AI NPUs in minutes. Just like Ollama, but purpose-built and deeply optimized for AMD NPUs. (☆572 · Updated this week)
- ☆695 · Updated last week
- ☆245 · Updated last year
- Open WebUI Client for Android is a mobile app for using Open WebUI interfaces with local or remote AI models. (☆134 · Updated 5 months ago)
- A daemon that automatically manages the performance states of NVIDIA GPUs. (☆104 · Updated last month)
- A one-click install script to enable Gen 12/13 Intel vGPU using SR-IOV technology so up to 7 client VMs can enjoy hardware GPU de… (☆77 · Updated last year)
- Ollama with Intel (i)GPU acceleration in Docker, plus benchmarks (☆29 · Updated last week)
- A script that automatically activates ASPM for all supported devices on Linux (☆421 · Updated 3 months ago)
- A comprehensive and versatile Bash script designed to simplify and optimize the configuration and management of Proxmox Virtual Enviro… (☆753 · Updated 3 weeks ago)
- ☆173 · Updated 2 months ago
- ☆205 · Updated this week
- Powerful search page powered by LLMs and SearXNG (☆264 · Updated 2 months ago)
- Interactive, locally hosted tool to migrate Open-WebUI SQLite databases to PostgreSQL (☆185 · Updated 2 months ago)
- Native mobile client for OpenWebUI. Chat with your self-hosted AI. (☆830 · Updated last week)
- Build your own distroless images with this mini file system and some binaries (☆59 · Updated last month)
- AMD (Radeon GPU) ROCm-based setup for popular AI tools on Ubuntu 24.04.1 (☆216 · Updated last month)
- API up your Ollama Server. (☆190 · Updated 2 months ago)
- Reliable model swapping for any local OpenAI/Anthropic-compatible server (llama.cpp, vllm, etc.) (☆2,086 · Updated last week)
- Persistent Linux 'jails' on TrueNAS SCALE to install software (k3s, docker, portainer, podman, etc.) with full access to all files via bi… (☆585 · Updated last year)
- Simple AI/LLM benchmarking tools. (☆169 · Updated last week)
- OpenAI-Compatible Proxy Middleware for the Wyoming Protocol (☆125 · Updated 3 weeks ago)
- Cluster Proxmox over Tailscale (☆206 · Updated 2 months ago)
- Web interface for Network UPS Tools (☆281 · Updated last week)
- Script for building Proxmox Backup Server 3.x (Bookworm) or 4.x (Trixie) for Armbian64 (☆326 · Updated last week)
- Web UI and API for managing MCP Orchestrator (mcpo) instances and configurations (☆127 · Updated 7 months ago)
- Automatically optimize files uploaded to Immich in order to save storage space (☆201 · Updated 3 weeks ago)
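
A common thread among several of these projects (the Intel inference engine, the Ollama variants, and the OpenAI-compatible proxies) is that they serve models over an OpenAI-style HTTP endpoint, so the same client code works regardless of which GPU backend runs underneath. Below is a minimal sketch of such a client, assuming a local Ollama-style server on its default port 11434 exposing the OpenAI-compatible /v1 path; the model name is a placeholder for whatever model you have pulled locally.

```python
# Minimal sketch: query a local OpenAI-compatible endpoint (e.g. Ollama's /v1 API).
# Assumptions: server listening on localhost:11434 (Ollama's default port) and a
# model named "llama3.2" already pulled locally; adjust both for your setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # OpenAI-compatible path served by Ollama
    api_key="ollama",                      # any non-empty string; local servers ignore it
)

response = client.chat.completions.create(
    model="llama3.2",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize what an iGPU render node is."}],
)
print(response.choices[0].message.content)
```

Because the endpoint shape is shared, pointing `base_url` at a different server from the list above (for example one of the proxies or the Intel inference engine) should be the only change needed.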