iohub / collama
VSCode AI coding assistant powered by self-hosted llama.cpp endpoint.
☆182 · Updated 4 months ago
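The description above says collama is driven by a self-hosted llama.cpp endpoint. As a rough illustration only (not taken from the collama source), the sketch below shows how an editor extension might call a llama.cpp server through its OpenAI-compatible `/v1/chat/completions` route; the port, model name, and prompt are placeholder assumptions.

```typescript
// Minimal sketch, assuming a llama.cpp server is running locally and exposes
// the OpenAI-compatible chat completions route. Not collama's actual code;
// the URL, model name, and prompt below are placeholders.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function complete(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:8080/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model", // placeholder; a local server typically serves whatever model it loaded
      messages: [
        { role: "system", content: "You are a coding assistant." },
        { role: "user", content: prompt },
      ] satisfies ChatMessage[],
      temperature: 0.2,
      max_tokens: 256,
    }),
  });

  if (!response.ok) {
    throw new Error(`llama.cpp server returned ${response.status}`);
  }

  // OpenAI-compatible servers return the reply in choices[0].message.content.
  const data = await response.json();
  return data.choices[0].message.content;
}

complete("Write a function that reverses a string in TypeScript.")
  .then(console.log)
  .catch(console.error);
```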
Alternatives and similar repositories for collama
Users interested in collama are comparing it to the libraries listed below.
- Dagger functions to import Hugging Face GGUF models into a local ollama instance and optionally push them to ollama.com. ☆115 · Updated last year
- ☆202 · Updated 2 weeks ago
- Link your Ollama models to LM-Studio ☆139 · Updated 10 months ago
- AI Studio is an independent app for utilizing LLMs. ☆268 · Updated this week
- LM Studio JSON configuration file format and a collection of example config files. ☆199 · Updated 9 months ago
- Something similar to Apple Intelligence? ☆60 · Updated 11 months ago
- Tool to download models from Hugging Face Hub and convert them to GGML/GGUF for llama.cpp ☆148 · Updated last month
- This small API downloads and exposes access to NeuML's txtai-wikipedia and full wikipedia datasets, taking in a query and returning full … ☆94 · Updated last month
- Efficient visual programming for AI language models ☆362 · Updated 3 weeks ago
- Review/Check GGUF files and estimate the memory usage and maximum tokens per second. ☆173 · Updated this week
- A fast batching API to serve LLM models ☆181 · Updated last year
- Export and back up Ollama models into GGUF and ModelFile ☆70 · Updated 8 months ago
- This is the Mixture-of-Agents (MoA) concept, adapted from the original work by TogetherAI. My version is tailored for local model usage a… ☆116 · Updated 11 months ago
- Ollama client written in Python ☆2 · Updated 6 months ago
- Rivet plugin for integration with Ollama, the tool for easily running LLMs locally ☆37 · Updated last year
- Automatically quantize GGUF models ☆179 · Updated this week
- Your gateway to both Ollama & Apple MLX models ☆134 · Updated 3 months ago
- Web UI for ExLlamaV2 ☆495 · Updated 3 months ago
- An AI assistant beyond the chat box. ☆329 · Updated last year
- The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs), allowing users to chat with LLM … ☆566 · Updated 3 months ago
- Open source alternative to Perplexity AI with the ability to run locally ☆205 · Updated 7 months ago
- Create Linux commands from natural language, in the shell. ☆110 · Updated 7 months ago
- No-messing-around sh client for llama.cpp's server ☆30 · Updated 9 months ago
- LLMX: Easiest 3rd party Local LLM UI for the web! ☆246 · Updated 2 weeks ago
- An OpenAI API-compatible API for chat with image input and questions about the images, a.k.a. multimodal. ☆254 · Updated 2 months ago
- Download models from the Ollama library, without Ollama ☆84 · Updated 6 months ago
- ☆129 · Updated last month
- Native GUI for several AI services plus local llama.cpp AIs. ☆114 · Updated last year
- SiLLM simplifies the process of training and running Large Language Models (LLMs) on Apple Silicon by leveraging the MLX framework. ☆268 · Updated last week
- Function calling-based LLM agents ☆285 · Updated 8 months ago