containers / ramalama
RamaLama is an open-source developer tool that simplifies the local serving of AI models from any source and facilitates their use for inference in production, all through the familiar language of containers.
☆2,577 · Updated this week
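As a rough illustration of the workflow RamaLama enables, here is a minimal sketch of querying a model that has already been started with `ramalama serve`. It assumes the server exposes an OpenAI-compatible chat endpoint on localhost:8080; the host, port, model name, and payload shape are assumptions, not taken from this listing, so adjust them to your setup.

```python
# Minimal sketch: query a model served locally (e.g. via `ramalama serve <model>`).
# Assumes an OpenAI-compatible endpoint on localhost:8080 -- adjust host/port as needed.
import json
import urllib.request

payload = {
    "model": "served-model",  # placeholder; many local servers ignore or default this field
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
```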
Alternatives and similar repositories for ramalama
Users interested in ramalama are comparing it to the repositories listed below
- Boot and upgrade via container images ☆1,872 · Updated this week
- InstructLab Core package. Use this to chat with a model and execute the InstructLab workflow to train a model using custom taxonomy data… ☆1,408 · Updated last week
- The terminal client for Ollama ☆2,318 · Updated last month
- Lemonade helps users discover and run local AI apps by serving optimized LLMs right from their own GPUs and NPUs. Join our discord: https… ☆2,113 · Updated this week
- Work with LLMs in a local environment using containers ☆278 · Updated last week
- Reliable model swapping for any local OpenAI/Anthropic-compatible server - llama.cpp, vllm, etc. ☆2,311 · Updated last week
- Achieve state-of-the-art inference performance with modern accelerators on Kubernetes ☆2,465 · Updated this week
- Tool for interactive command line environments on Linux ☆3,224 · Updated this week
- ☆911 · Updated this week
- Generate Podman Quadlet files from a Podman command, compose file, or existing object ☆1,326 · Updated this week
- An external provider for Llama Stack allowing for the use of RamaLama for inference. ☆20 · Updated last month
- VS Code extension for LLM-assisted code/text completion ☆1,150 · Updated 3 weeks ago
- A container for deploying bootable container images. ☆407 · Updated this week
- 📖 Runbooks that run ☆2,305 · Updated this week
- Open-source LLM load balancer and serving platform for self-hosting LLMs at scale 🏓🦙 Alternative to projects like llm-d, Docker Model R… ☆1,447 · Updated this week
- Support for bootable OS containers (bootc) and generating disk images ☆467 · Updated this week
- Go manage your Ollama models ☆1,666 · Updated last month
- The next generation Linux workstation, designed for reliability, performance, and sustainability. ☆2,304 · Updated this week
- Proxy that allows you to use Ollama as a copilot, like GitHub Copilot ☆825 · Updated this week
- One command brings a complete pre-wired LLM stack with hundreds of services to explore. ☆2,406 · Updated this week
- CodeGate: Security, Workspaces and Multiplexing for AI Agentic Frameworks ☆712 · Updated 8 months ago
- Granite Code Models: A Family of Open Foundation Models for Code Intelligence ☆1,245 · Updated 7 months ago
- Podman Desktop is the best free and open source tool to work with Containers and Kubernetes for developers. Get an intuitive and user-fri… ☆7,293 · Updated this week
- An awesome curated knowledge-base about atomic systems ☆1,176 · Updated 5 months ago
- Examples for building and running LLM services and applications locally with Podman ☆190 · Updated 6 months ago
- Build AI agents for your PC ☆916 · Updated this week
- Distributed LLM inference. Connect home devices into a powerful cluster to accelerate LLM inference. More devices mean faster inference. ☆2,822 · Updated this week
- Podman Terminal UI ☆1,036 · Updated last week
- SCUDA is a GPU over IP bridge allowing GPUs on remote machines to be attached to CPU-only machines. ☆1,803 · Updated last month
- Open-source, self-hosted sandboxes for AI agents ☆4,718 · Updated this week