amirrezaDev1378 / ollama-model-direct-download
Ollama model direct link generator and installer.
☆226 · Updated 10 months ago
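For context, here is a minimal sketch of how a direct download link for an Ollama model can be built from the public registry. It assumes the OCI-style endpoint layout that registry.ollama.ai exposes (manifest at `/v2/library/<model>/manifests/<tag>`, blobs at `/v2/library/<model>/blobs/<digest>`); it is an illustration only, not necessarily how ollama-model-direct-download itself is implemented.

```python
# Sketch (assumption): resolve a direct URL to an Ollama model's GGUF weights
# by reading the OCI-style manifest from the public registry.
import json
import urllib.request

REGISTRY = "https://registry.ollama.ai"

def direct_model_url(model: str, tag: str = "latest") -> str:
    """Return a direct URL to the model-weights blob for library/<model>:<tag>."""
    manifest_url = f"{REGISTRY}/v2/library/{model}/manifests/{tag}"
    req = urllib.request.Request(
        manifest_url,
        headers={"Accept": "application/vnd.docker.distribution.manifest.v2+json"},
    )
    with urllib.request.urlopen(req) as resp:
        manifest = json.load(resp)
    # The layer whose media type ends in ".model" holds the GGUF weights;
    # its digest addresses the blob directly.
    weights_layer = next(
        layer for layer in manifest["layers"]
        if layer["mediaType"].endswith(".model")
    )
    return f"{REGISTRY}/v2/library/{model}/blobs/{weights_layer['digest']}"

if __name__ == "__main__":
    # Example usage; the model/tag pair is hypothetical.
    print(direct_model_url("llama3.2", "1b"))
```

The returned blob URL can then be fetched with any downloader (curl, wget, a browser), which is the core convenience such tools provide.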
Alternatives and similar repositories for ollama-model-direct-download
Users interested in ollama-model-direct-download are comparing it to the repositories listed below.
- Export and back up Ollama models into GGUF and Modelfile ☆89 · Updated last year
- Download models from the Ollama library, without Ollama ☆117 · Updated last year
- A proxy server for multiple Ollama instances with key security ☆549 · Updated last month
- Ollama chat client in Vue; everything you need to run your private text RPG in the browser ☆135 · Updated last year
- Tool to download models from the Huggingface Hub and convert them to GGML/GGUF for llama.cpp ☆166 · Updated 7 months ago
- ☆109 · Updated 4 months ago
- LLMX; the easiest third-party local LLM UI for the web! ☆284 · Updated last month
- ☆127 · Updated last year
- LM inference server implementation based on *.cpp. ☆294 · Updated last month
- Web UI for Ollama, OpenAI, Anthropic, Google, Deepseek, OpenRouter, Mistral.ai, Together.ai, Groq.com ☆83 · Updated 7 months ago
- Review/check GGUF files and estimate the memory usage and maximum tokens per second. ☆221 · Updated 4 months ago
- VSCode AI coding assistant powered by a self-hosted llama.cpp endpoint. ☆183 · Updated 10 months ago
- Lightweight, standalone, multi-platform, and privacy-focused local LLM chat interface with optional encryption ☆152 · Updated 8 months ago
- Automatically quantize GGUF models ☆219 · Updated 2 months ago
- Easily access your Ollama models within LMStudio ☆126 · Updated last year
- An Open WebUI function for a better R1 experience ☆78 · Updated 9 months ago
- Easy-to-use interface for the Whisper model, optimized for all GPUs! ☆405 · Updated 4 months ago
- LLM Benchmark for Throughput via Ollama (Local LLMs) ☆319 · Updated 2 weeks ago
- Plug Whisper audio transcription into a local Ollama server and output TTS audio responses ☆362 · Updated 2 months ago
- ☆94 · Updated 5 months ago
- Fully-featured, beautiful web interface for vLLM - built with NextJS. ☆165 · Updated last week
- A text-to-speech and speech-to-text server compatible with the OpenAI API, supporting Whisper, FunASR, Bark, and CosyVoice backends. ☆184 · Updated 3 weeks ago
- Aiming to provide a seamless and privacy-driven chatting experience with open-sourced technologies (Ollama), particularly open-sourced LLM… ☆169 · Updated 2 months ago
- Nginx proxy server in a Docker container to authenticate and proxy requests to Ollama from the public internet via Cloudflare Tunnel ☆154 · Updated 3 months ago
- Run LLMs on AMD Ryzen™ AI NPUs in minutes. Just like Ollama - but purpose-built and deeply optimized for AMD NPUs. ☆560 · Updated this week
- Polyglot is a fast, elegant, and free translation tool using AI. ☆64 · Updated last month
- Simple Go utility to download HuggingFace models and datasets ☆781 · Updated 3 months ago
- Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, ea… ☆192 · Updated 6 months ago
- Croco.Cpp is a fork of KoboldCPP for inferring GGML/GGUF models on CPU/CUDA with KoboldAI's UI. It's powered partly by IK_LLama.cpp, and compati… ☆154 · Updated this week
- AI Studio is an independent app for utilizing LLMs. ☆362 · Updated last week