containers / ramalama
The goal of RamaLama is to make working with AI boring.
☆1,540 · Updated this week
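RamaLama is a CLI for pulling, running, and serving AI models inside containers (commands such as `ramalama pull`, `ramalama run`, and `ramalama serve`). As a rough illustration only, the sketch below shows how a script might talk to a model once it is being served locally; the port and the OpenAI-compatible endpoint path are assumptions based on the llama.cpp server RamaLama typically wraps, not something this listing documents.

```python
# Minimal sketch of querying a model served locally (for example via
# `ramalama serve <model>`). ASSUMPTIONS: the server exposes a llama.cpp-style
# OpenAI-compatible /v1/chat/completions endpoint on localhost:8080; check
# your RamaLama version's docs for the actual port and API surface.
import json
import urllib.request


def chat(prompt: str, base_url: str = "http://localhost:8080") -> str:
    payload = {"messages": [{"role": "user", "content": prompt}]}
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-style responses carry the reply under choices[0].message.content.
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(chat("Say hello in one sentence."))
```

Using only the standard library keeps the sketch dependency-free; any OpenAI-compatible client library would work the same way against such an endpoint.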
Alternatives and similar repositories for ramalama:
Users interested in ramalama are comparing it to the repositories listed below.
- Boot and upgrade via container images ☆1,155 · Updated last week
- Work with LLMs in a local environment using containers ☆215 · Updated last week
- InstructLab Core package. Use this to chat with a model and execute the InstructLab workflow to train a model using custom taxonomy data… ☆1,253 · Updated this week
- A text-based terminal client for Ollama ☆1,652 · Updated this week
- Effortlessly run LLM backends, APIs, frontends, and services with one command. ☆1,598 · Updated last week
- Support for bootable OS containers (bootc) and generating disk images ☆431 · Updated this week
- Run any Linux process in a secure, unprivileged sandbox using Landlock. Think firejail, but lightweight, user-friendly, and baked into th… ☆1,601 · Updated 2 weeks ago
- Model swapping for llama.cpp (or any local OpenAI-compatible server) ☆544 · Updated last week
- Stateful load balancer custom-tailored for llama.cpp 🏓🦙 ☆744 · Updated this week
- Podman desktop companion ☆1,548 · Updated last week
- LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software e… ☆2,700 · Updated 3 months ago
- Tool for interactive command line environments on Linux ☆2,789 · Updated this week
- SCUDA is a GPU over IP bridge allowing GPUs on remote machines to be attached to CPU-only machines. ☆1,699 · Updated last week
- Taxonomy tree that will allow you to create models tuned with your data ☆258 · Updated this week
- Lilipod is a simple container manager, able to download, unpack and use OCI images from various container registries. ☆436 · Updated 2 weeks ago
- A fast Rust based tool to serialize text-based files in a repository or directory for LLM consumption ☆1,984 · Updated this week
- VS Code extension for LLM-assisted code/text completion ☆678 · Updated last week
- Vim plugin for LLM-assisted code/text completion ☆1,367 · Updated last month
- A powerful document AI question-answering tool that connects to your local Ollama models. Create, manage, and interact with RAG systems f… ☆969 · Updated 3 weeks ago
- An open source DevOps tool for packaging and versioning AI/ML models, datasets, code, and configuration into an OCI artifact. ☆776 · Updated this week
- LLM plugin providing access to models running on an Ollama server ☆282 · Updated 2 weeks ago
- A container for deploying bootable container images. ☆191 · Updated this week
- Generate Podman Quadlet files from a Podman command, compose file, or existing object ☆789 · Updated 5 months ago
- WebAssembly binding for llama.cpp - Enabling on-browser LLM inference ☆666 · Updated last week
- Wireshark for Docker containers ☆2,453 · Updated 2 weeks ago
- Examples for building and running LLM services and applications locally with Podman ☆149 · Updated this week
- AI Inference Operator for Kubernetes. The easiest way to serve ML models in production. Supports VLMs, LLMs, embeddings, and speech-to-te… ☆893 · Updated this week
- Text-To-Speech, RAG, and LLMs. All local! ☆1,786 · Updated 4 months ago
- Like jq but for Markdown: find specific elements in a Markdown doc ☆1,508 · Updated this week
- Proxy that allows you to use Ollama as a copilot, like GitHub Copilot ☆629 · Updated 3 weeks ago