amirrezaDev1378 / ollama-model-direct-download
Ollama model direct link generator and installer.
☆217 · Updated 7 months ago
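For context on what this repository does, the sketch below shows one way direct download links can be built for a model hosted in the public Ollama registry, which serves manifests and blobs in an OCI-distribution-style layout. This is a minimal sketch under assumptions about the registry's public API (host `registry.ollama.ai`, the `library` namespace, and the Accept header); it is not taken from this repository's code.

```python
# Hedged sketch: list direct blob URLs for an Ollama library model by reading its
# registry manifest. Registry host, path layout, and header are assumptions.
import json
import urllib.request

REGISTRY = "https://registry.ollama.ai"  # assumed public registry host


def direct_links(model: str, tag: str = "latest") -> list[str]:
    """Return direct blob URLs (weights, template, params) for a model in the 'library' namespace."""
    manifest_url = f"{REGISTRY}/v2/library/{model}/manifests/{tag}"
    req = urllib.request.Request(
        manifest_url,
        headers={"Accept": "application/vnd.docker.distribution.manifest.v2+json"},
    )
    with urllib.request.urlopen(req) as resp:
        manifest = json.load(resp)
    # Each layer digest is a directly downloadable blob; the largest one is the GGUF weights.
    return [
        f"{REGISTRY}/v2/library/{model}/blobs/{layer['digest']}"
        for layer in manifest.get("layers", [])
    ]


if __name__ == "__main__":
    for url in direct_links("llama3.2", "3b"):
        print(url)
```

The printed URLs can then be handed to a regular download manager or curl/aria2, which is the usual reason to want direct links in the first place.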
Alternatives and similar repositories for ollama-model-direct-download
Users interested in ollama-model-direct-download are comparing it to the repositories listed below.
- LM inference server implementation based on *.cpp. ☆279 · Updated last month
- Ollama chat client in Vue; everything you need for your private text RPG in the browser. ☆134 · Updated 11 months ago
- ☆122 · Updated 11 months ago
- A high-throughput and memory-efficient inference and serving engine for LLMs (Windows build & kernels). ☆175 · Updated last week
- Export and back up Ollama models into GGUF and Modelfile. ☆82 · Updated last year
- Download models from the Ollama library, without Ollama. ☆100 · Updated 10 months ago
- Review/check GGUF files and estimate memory usage and maximum tokens per second (a rough memory-estimate sketch follows this list). ☆208 · Updated last month
- A text-to-speech and speech-to-text server compatible with the OpenAI API, supporting Whisper, FunASR, Bark, and CosyVoice backends. ☆163 · Updated 2 months ago
- An Open WebUI function for a better R1 experience. ☆78 · Updated 7 months ago
- Lightweight, standalone, multi-platform, and privacy-focused local LLM chat interface with optional encryption. ☆142 · Updated 5 months ago
- Open-source local data analysis assistant. ☆41 · Updated this week
- Aiming to provide a seamless and privacy-driven chatting experience with open-source technologies (Ollama), particularly open-source LLM… ☆163 · Updated 3 weeks ago
- ☆105 · Updated 2 weeks ago
- Polyglot is a fast, elegant, and free translation tool using AI. ☆63 · Updated last year
- A proxy server for multiple Ollama instances with key-based security. ☆499 · Updated 2 weeks ago
- Easy-to-use interface for the Whisper model, optimized for all GPUs! ☆320 · Updated 2 months ago
- AI Studio is an independent app for utilizing LLMs. ☆306 · Updated 2 weeks ago
- A minimal interface for AI Companion that runs entirely in your browser. ☆144 · Updated this week
- Tool to download models from the Hugging Face Hub and convert them to GGML/GGUF for llama.cpp. ☆159 · Updated 5 months ago
- ☆102 · Updated last month
- LM Studio localization. ☆153 · Updated last week
- Easily access your Ollama models within LM Studio. ☆120 · Updated last year
- An OpenAI-API-compatible API for chat with image input and questions about the images (i.e., multimodal). ☆259 · Updated 7 months ago
- Golang web client for Ollama, fast and easy to use. ☆29 · Updated 2 months ago
- Automatically quantize GGUF models. ☆204 · Updated last week
- ☆93 · Updated 3 months ago
- Generate a llama-quantize command to copy the quantization parameters of any GGUF. ☆24 · Updated 2 months ago
- LLMX: easiest third-party local LLM UI for the web! ☆270 · Updated last month
- LLM inference in C/C++. ☆102 · Updated last month
- A LibreOffice Writer extension that adds local-inference generative AI features. ☆140 · Updated last month
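As a rough illustration of the kind of estimate the GGUF review/check tool above produces (not its exact method), the sketch below adds the weight file size to the KV-cache size implied by the model's attention geometry. All concrete numbers (file size, layer count, context length, head geometry, fp16 cache) are illustrative assumptions.

```python
# Hedged sketch: lower-bound memory estimate for running a GGUF model,
# weights file + KV cache; smaller compute/scratch buffers are ignored.
def kv_cache_bytes(n_layers: int, n_ctx: int, n_kv_heads: int, head_dim: int,
                   bytes_per_elem: int = 2) -> int:
    # 2x for the separate K and V tensors kept per layer and context position (fp16 = 2 bytes/element).
    return 2 * n_layers * n_ctx * n_kv_heads * head_dim * bytes_per_elem


def estimate_total_bytes(weight_file_bytes: int, **geometry: int) -> int:
    # Treat the result as a lower bound on required RAM/VRAM.
    return weight_file_bytes + kv_cache_bytes(**geometry)


if __name__ == "__main__":
    est = estimate_total_bytes(
        weight_file_bytes=4_900_000_000,  # ~4.9 GB Q4_K_M 8B file (assumed)
        n_layers=32, n_ctx=8192, n_kv_heads=8, head_dim=128,  # Llama-3-8B-like geometry (assumed)
    )
    print(f"roughly {est / 1e9:.1f} GB of RAM/VRAM needed")
```

With the assumed numbers this prints roughly 6.0 GB: about 4.9 GB of weights plus about 1.1 GB of fp16 KV cache at an 8192-token context.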