amirrezaDev1378 / ollama-model-direct-download
Ollama model direct link generator and installer.
☆226 · Updated 10 months ago
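Direct-link generators of this kind typically build download URLs against the Ollama registry's OCI-style HTTP layout rather than going through the `ollama` CLI. A minimal sketch of that idea, assuming the `registry.ollama.ai` host and the `/v2/<namespace>/<name>/manifests|blobs/...` path layout (the helper names here are hypothetical, not this repository's API):

```python
# Hedged sketch: URL construction for an OCI-style model registry.
# Assumes registry.ollama.ai uses /v2/<namespace>/<name>/manifests/<tag>
# and /v2/<namespace>/<name>/blobs/<digest>; verify against the actual repo.

REGISTRY = "https://registry.ollama.ai"


def _split(model: str) -> tuple[str, str]:
    """Split 'user/model' into (namespace, name); bare names fall back to 'library'."""
    namespace, _, name = model.rpartition("/")
    return (namespace or "library", name)


def manifest_url(model: str, tag: str = "latest") -> str:
    """URL of the manifest listing the model's layers (blobs)."""
    namespace, name = _split(model)
    return f"{REGISTRY}/v2/{namespace}/{name}/manifests/{tag}"


def blob_url(model: str, digest: str) -> str:
    """Direct download URL for one layer, addressed by its content digest."""
    namespace, name = _split(model)
    return f"{REGISTRY}/v2/{namespace}/{name}/blobs/{digest}"
```

In practice a tool would fetch the manifest JSON, read each layer's `digest`, and emit one `blob_url` per layer so the model weights can be downloaded with any HTTP client.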
Alternatives and similar repositories for ollama-model-direct-download
Users who are interested in ollama-model-direct-download are comparing it to the libraries listed below
- Ollama chat client in Vue, everything you need to do your private text RPG in the browser ☆135 · Updated last year
- Aiming to provide a seamless and privacy-driven chatting experience with open-sourced technologies (Ollama), particularly open-sourced LLM… ☆169 · Updated 2 months ago
- A proxy server for multiple Ollama instances with key security ☆551 · Updated last month
- Export and back up Ollama models into GGUF and Modelfile ☆89 · Updated last year
- Lightweight, standalone, multi-platform, and privacy-focused local LLM chat interface with optional encryption ☆153 · Updated 8 months ago
- ☆109 · Updated 4 months ago
- LLMX: easiest 3rd-party local LLM UI for the web! ☆284 · Updated last month
- Generate a llama-quantize command to copy the quantization parameters of any GGUF ☆28 · Updated 4 months ago
- Download models from the Ollama library, without Ollama ☆118 · Updated last year
- Nginx proxy server in a Docker container to authenticate & proxy requests to Ollama from the public Internet via Cloudflare Tunnel ☆154 · Updated 3 months ago
- Link your Ollama models to LM Studio ☆150 · Updated last year
- VSCode AI coding assistant powered by a self-hosted llama.cpp endpoint. ☆183 · Updated 11 months ago
- AI Studio is an independent app for utilizing LLMs. ☆362 · Updated this week
- A single-file tkinter-based Ollama GUI project with no external dependencies. ☆238 · Updated last month
- Inference engine for Intel devices. Serve LLMs, VLMs, Whisper, Kokoro-TTS, embedding, and rerank models over OpenAI endpoints. ☆267 · Updated this week
- ☆127 · Updated last year
- A minimal LLM chat app that runs entirely in your browser ☆1,057 · Updated 2 months ago
- LM inference server implementation based on *.cpp. ☆294 · Updated last month
- Easily access your Ollama models within LM Studio ☆127 · Updated last year
- Command-line personal assistant using your favorite proprietary or local models, with access to more than 30 tools ☆112 · Updated 6 months ago
- ☆108 · Updated 2 weeks ago
- LM Studio localization ☆167 · Updated this week
- Tool to download models from the Hugging Face Hub and convert them to GGML/GGUF for llama.cpp ☆166 · Updated 8 months ago
- An Open WebUI function for a better R1 experience ☆78 · Updated 9 months ago
- MCP server for connecting agentic systems to search systems via SearXNG ☆108 · Updated 10 months ago
- Polyglot is a fast, elegant, and free translation tool using AI. ☆64 · Updated last month
- Review/check GGUF files and estimate the memory usage and maximum tokens per second. ☆221 · Updated 4 months ago
- ☆94 · Updated 5 months ago
- Easy-to-use interface for the Whisper model, optimized for all GPUs! ☆406 · Updated this week
- ☆210 · Updated 3 months ago