amirrezaDev1378 / ollama-model-direct-download
Ollama model direct link generator and installer.
☆220 · Updated 8 months ago
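The project's core idea, resolving an Ollama model name to direct, downloadable URLs, can be sketched roughly as below. This is a minimal illustration assuming the public registry at registry.ollama.ai, which serves models through an OCI-style distribution API; the `direct_links` helper and the model/tag names are hypothetical examples, not the project's actual code.

```python
# Minimal sketch: resolve an Ollama model manifest to direct blob URLs.
# Assumes the public registry at registry.ollama.ai (OCI-style distribution API).
import json
import urllib.request

REGISTRY = "https://registry.ollama.ai"

def direct_links(model: str, tag: str = "latest") -> list[str]:
    """Return direct blob URLs for every layer of an Ollama model manifest."""
    req = urllib.request.Request(
        f"{REGISTRY}/v2/library/{model}/manifests/{tag}",
        headers={"Accept": "application/vnd.docker.distribution.manifest.v2+json"},
    )
    with urllib.request.urlopen(req) as resp:
        manifest = json.load(resp)
    # Each layer (weights, template, parameters, ...) is content-addressed by a
    # sha256 digest and can be fetched directly as a blob.
    return [
        f"{REGISTRY}/v2/library/{model}/blobs/{layer['digest']}"
        for layer in manifest.get("layers", [])
    ]

if __name__ == "__main__":
    # Example model/tag; substitute any model from the Ollama library.
    for url in direct_links("llama3", "8b"):
        print(url)
```

Because each layer is content-addressed, the printed URLs can be fetched with any HTTP client or download manager rather than through the Ollama CLI.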
Alternatives and similar repositories for ollama-model-direct-download
Users interested in ollama-model-direct-download are comparing it to the repositories listed below.
- LM Studio localization ☆155 · Updated last week
- ☆124 · Updated 11 months ago
- A proxy server for multiple Ollama instances with key-based security ☆515 · Updated 2 weeks ago
- Lightweight, standalone, multi-platform, and privacy-focused local LLM chat interface with optional encryption ☆146 · Updated 6 months ago
- Export and back up Ollama models to GGUF and Modelfile ☆84 · Updated last year
- Ollama chat client in Vue, with everything you need to run your private text RPG in the browser ☆134 · Updated last year
- ☆105 · Updated last month
- ☆104 · Updated 2 months ago
- LM inference server implementation based on *.cpp. ☆286 · Updated 2 months ago
- ☆93 · Updated 3 months ago
- Privacy-first agentic framework with powerful reasoning & task automation capabilities. Natively distributed and fully ISO 27XXX complian… ☆66 · Updated 6 months ago
- A LibreOffice Writer extension that adds local-inference generative AI features. ☆142 · Updated 2 months ago
- Tool to download models from Huggingface Hub and convert them to GGML/GGUF for llama.cpp ☆160 · Updated 6 months ago
- Review/Check GGUF files and estimate the memory usage and maximum tokens per second. ☆211 · Updated 2 months ago
- A single-file tkinter-based Ollama GUI project with no external dependencies. ☆222 · Updated 7 months ago
- This extension enhances the capabilities of textgen-webui by integrating advanced vision models, allowing users to have contextualized co… ☆57 · Updated last year
- A text-to-speech and speech-to-text server compatible with the OpenAI API, supporting Whisper, FunASR, Bark, and CosyVoice backends. ☆165 · Updated 3 months ago
- Convert your PDFs and EPUBs into audiobooks effortlessly. Features intelligent text extraction, customizable text-to-speech settings, and… ☆131 · Updated 7 months ago
- Polyglot is a fast, elegant, and free translation tool using AI. ☆63 · Updated last year
- A high-throughput and memory-efficient inference and serving engine for LLMs (Windows build & kernels) ☆195 · Updated last week
- Easily access your Ollama models within LMStudio ☆121 · Updated last year
- VSCode AI coding assistant powered by a self-hosted llama.cpp endpoint. ☆183 · Updated 9 months ago
- Cortex.Tensorrt-LLM is a C++ inference library that can be loaded by any server at runtime. It submodules NVIDIA’s TensorRT-LLM for GPU a… ☆42 · Updated last year
- RetroChat is a powerful command-line interface for interacting with various AI language models. It provides a seamless experience for eng… ☆81 · Updated 3 months ago
- ☆41 · Updated 8 months ago
- Download models from the Ollama library, without Ollama ☆104 · Updated 11 months ago
- Web interface for administering Ollama and model quantization, with public endpoints and an automated OpenAI proxy ☆49 · Updated 7 months ago
- Simple Go utility to download HuggingFace models and datasets ☆749 · Updated last month
- Automatically quantize GGUF models ☆214 · Updated last week
- Open Source Local Data Analysis Assistant. ☆42 · Updated last week