containers / ramalama
RamaLama is an open-source developer tool that simplifies the local serving of AI models from any source and facilitates their use for inference in production, all through the familiar language of containers.
☆2,425 · Updated this week
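Since RamaLama's local serving exposes a REST endpoint, a minimal client sketch follows. It assumes a model has already been served with the `ramalama` CLI, that the endpoint is OpenAI-compatible, and that it listens on localhost:8080; the port and model name are placeholders, not values taken from this listing.

```python
# Minimal sketch: talk to a locally served model through an
# OpenAI-compatible endpoint. Host, port, and model name are assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

reply = client.chat.completions.create(
    model="default",  # placeholder; use the name the local server reports
    messages=[{"role": "user", "content": "Summarize what RamaLama does."}],
)
print(reply.choices[0].message.content)
```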
Alternatives and similar repositories for ramalama
Users interested in ramalama are comparing it to the libraries listed below.
- Boot and upgrade via container images ☆1,769 · Updated this week
- Reliable model swapping for any local OpenAI/Anthropic-compatible server (llama.cpp, vllm, etc.) ☆2,086 · Updated last week
- InstructLab Core package. Use this to chat with a model and execute the InstructLab workflow to train a model using custom taxonomy data… ☆1,394 · Updated this week
- The terminal client for Ollama ☆2,294 · Updated last week
- Work with LLMs in a local environment using containers ☆272 · Updated last week
- Lemonade helps users discover and run local AI apps by serving optimized LLMs right from their own GPUs and NPUs. Join our discord: https… ☆1,920 · Updated this week
- VS Code extension for LLM-assisted code/text completion ☆1,114 · Updated last month
- Open-source LLM load balancer and serving platform for self-hosting LLMs at scale 🏓 🦙 ☆1,406 · Updated this week
- ☆695 · Updated last week
- An external provider for Llama Stack that allows RamaLama to be used for inference. ☆20 · Updated last week
- A container for deploying bootable container images. ☆397 · Updated 2 weeks ago
- The next generation Linux workstation, designed for reliability, performance, and sustainability. ☆2,185 · Updated this week
- Proxy that allows you to use Ollama as a copilot, like GitHub Copilot ☆796 · Updated 3 months ago
- A powerful document AI question-answering tool that connects to your local Ollama models. Create, manage, and interact with RAG systems f… ☆1,092 · Updated 4 months ago
- Tool for interactive command line environments on Linux ☆3,168 · Updated 2 weeks ago
- Examples for building and running LLM services and applications locally with Podman ☆188 · Updated 4 months ago
- Support for bootable OS containers (bootc) and generating disk images ☆466 · Updated this week
- Go manage your Ollama models ☆1,624 · Updated 2 weeks ago
- Generate Podman Quadlet files from a Podman command, compose file, or existing object ☆1,235 · Updated last year
- Taxonomy tree that will allow you to create models tuned with your data ☆287 · Updated 3 months ago
- Local CLI Copilot, powered by Ollama. 💻🦙 ☆1,462 · Updated 2 months ago
- Podman desktop companion ☆1,577 · Updated last month
- Effortlessly run LLM backends, APIs, frontends, and services with one command. ☆2,209 · Updated last week
- A multi-platform desktop application to evaluate and compare LLM models, written in Rust and React. ☆892 · Updated last month
- Podman Desktop is the best free and open source tool to work with Containers and Kubernetes for developers. Get an intuitive and user-fri… ☆7,133 · Updated this week
- Red Hat Enterprise Linux AI -- Developer Preview ☆169 · Updated last year
- Granite Code Models: A Family of Open Foundation Models for Code Intelligence ☆1,244 · Updated 6 months ago
- llama.cpp fork with additional SOTA quants and improved performance ☆1,407 · Updated this week
- Communicate with an LLM provider using a single interface ☆1,519 · Updated this week
- Inference engine for Intel devices. Serve LLMs, VLMs, Whisper, Kokoro-TTS, Embedding and Rerank models over OpenAI endpoints. ☆267 · Updated this week