mattcurf / ollama-intel-gpu
☆253 · Updated 2 months ago
Alternatives and similar repositories for ollama-intel-gpu
Users interested in ollama-intel-gpu compare it to the repositories listed below.
- Make use of Intel Arc Series GPU to run Ollama, StableDiffusion, Whisper and Open WebUI, for image generation, speech recognition and int… ☆118 · Updated 2 months ago
- ☆231 · Updated last year
- A step-by-step guide to enable Gen 12/13 Intel vGPU using SR-IOV technology so up to 7 client VMs can enjoy hardware GPU decoding ☆266 · Updated 9 months ago
- A comprehensive and versatile Bash script designed to simplify and optimize the configuration and management of Proxmox Virtual Enviro… ☆518 · Updated 2 weeks ago
- AMD APU compatible Ollama. Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 3, Mistral Small 3.1 and other large language mod… ☆85 · Updated this week
- A one-click install script to enable Gen 12/13 Intel vGPU using SR-IOV technology so up to 7 client VMs can enjoy hardware GPU de… ☆73 · Updated 9 months ago
- Lightweight inference server for OpenVINO ☆202 · Updated this week
- Automatically scale LXC container resources on Proxmox hosts ☆209 · Updated 5 months ago
- ☆53 · Updated last year
- Open WebUI Client for Android is a mobile app for using Open WebUI interfaces with local or remote AI models. ☆103 · Updated last month
- Linux distro for AI computers. Go from bare-metal GPUs to running AI workloads (vLLM, SGLang, RAG, and agents) in minutes, fully a… ☆240 · Updated last week
- A simple GUI for configuring Traefik routes ☆112 · Updated 2 months ago
- A script that automatically activates ASPM for all supported devices on Linux ☆330 · Updated 8 months ago
- Run LLMs on AMD Ryzen™ AI NPUs. Just like Ollama, but purpose-built and deeply optimized for AMD NPUs. ☆148 · Updated this week
- Persistent Linux 'jails' on TrueNAS SCALE to install software (k3s, docker, portainer, podman, etc.) with full access to all files via bi… ☆587 · Updated 9 months ago
- API up your Ollama server. ☆173 · Updated 2 months ago
- Native mobile client for Open WebUI. Chat with your self-hosted AI. ☆309 · Updated this week
- Script for building Proxmox Backup Server 3.x (Bookworm) or 4.x (Trixie) for Armbian64 ☆275 · Updated this week
- A drop-in replacement for portainer/portainer-ce, without annoying UI elements or tracking scripts ☆165 · Updated last year
- Chanakya is an advanced, open-source, and self-hostable voice assistant designed for privacy, power, and flexibility. It leverages local … ☆147 · Updated last week
- LLM benchmark for throughput via Ollama (local LLMs) ☆280 · Updated 2 weeks ago
- A tunneling client for Pangolin ☆453 · Updated this week
- Ollama with Intel (i)GPU acceleration in Docker, with benchmarks ☆19 · Updated this week
- Automatically scale virtual machine resources on Proxmox hosts ☆258 · Updated last month
- Build your own distroless images with this mini file system and some binaries ☆43 · Updated this week
- Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, ea… ☆183 · Updated 2 months ago
- LXD Graphical Web Console ☆354 · Updated 3 months ago
- Caddy Docker custom images built with different combinations of modules. All images are updated automatically when a new version of Caddy… ☆219 · Updated 2 weeks ago
- A simple WireGuard interface management server written in Go ☆158 · Updated this week
- Search the web and your self-hosted apps using local AI agents. ☆460 · Updated 9 months ago
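For context, most of the Intel GPU projects above boil down to the same container pattern: pass the host's `/dev/dri` render nodes through to a container running Ollama. A minimal Docker Compose sketch of that pattern follows; the image name and tag are illustrative assumptions, not taken from any repository in this list, and each project documents its own image and build steps.

```yaml
# Hypothetical compose file: Ollama with Intel GPU passthrough.
# "ollama-intel-gpu:latest" is an assumed locally built image, not an
# official published tag.
services:
  ollama:
    image: ollama-intel-gpu:latest
    restart: unless-stopped
    devices:
      - /dev/dri:/dev/dri        # expose Intel GPU render nodes to the container
    volumes:
      - ollama-data:/root/.ollama  # persist pulled models across restarts
    ports:
      - "11434:11434"            # default Ollama API port

volumes:
  ollama-data:
```

The `devices:` passthrough is what distinguishes these setups from a plain CPU-only Ollama container; SR-IOV-based projects in the list take the further step of splitting one physical Intel GPU into multiple virtual functions so several VMs can share it.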