Maximilian-Winter / llama-cpp-agent
The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It lets users chat with LLMs, execute structured function calls, and get structured output, and it also works with models that are not fine-tuned for JSON output or function calling.
☆493 · Updated 3 months ago
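As a rough illustration of the interaction style described above, here is a minimal chat sketch adapted from the project's README. The class and parameter names (`LlamaCppAgent`, `LlamaCppPythonProvider`, `MessagesFormatterType`) and the model path are assumptions that may differ between versions, so treat this as a sketch rather than canonical usage.

```python
# Minimal sketch of chatting with a local GGUF model through llama-cpp-agent.
# Names below follow the project's README but are not verified against a
# specific release; check the installed version before relying on them.
from llama_cpp import Llama
from llama_cpp_agent import LlamaCppAgent, MessagesFormatterType
from llama_cpp_agent.providers import LlamaCppPythonProvider

# Load a local model with llama-cpp-python (the path is a placeholder).
llama_model = Llama("path/to/model.gguf", n_ctx=4096, n_gpu_layers=-1, verbose=False)

# Wrap the model in a provider and build an agent with a chat template.
provider = LlamaCppPythonProvider(llama_model)
agent = LlamaCppAgent(
    provider,
    system_prompt="You are a helpful assistant.",
    predefined_messages_formatter_type=MessagesFormatterType.CHATML,
)

# Plain chat; the same agent can also be configured for structured output
# and function calling through the framework's structured-output settings.
print(agent.get_chat_response("Name three uses of local LLMs."))
```

Structured output in this framework is driven largely by grammar-constrained sampling in llama.cpp (grammars generated from the target schema), which is broadly why it can also work with models that lack JSON fine-tuning.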
Related projects
Alternatives and complementary repositories for llama-cpp-agent
- An application for running LLMs locally on your device, with your documents, facilitating detailed citations in generated responses. ☆498 · Updated 3 weeks ago
- Function-calling-based LLM agents ☆278 · Updated 2 months ago
- 🚀 Retrieval Augmented Generation (RAG) with txtai. Combine search and LLMs to find insights with your own data. ☆281 · Updated this week
- A multimodal, function-calling-powered LLM web UI. ☆208 · Updated last month
- A Python package for developing AI applications with local LLMs. ☆140 · Updated 4 months ago
- Your Trusty Memory-enabled AI Companion - Simple RAG chatbot optimized for local LLMs | 12 Languages Supported | OpenAI API Compatible ☆263 · Updated 2 months ago
- A fast batching API to serve LLMs ☆172 · Updated 6 months ago
- SiLLM simplifies the process of training and running Large Language Models (LLMs) on Apple Silicon by leveraging the MLX framework. ☆226 · Updated this week
- An AI assistant beyond the chat box. ☆315 · Updated 8 months ago
- A Python application that routes incoming prompts to an LLM by category, and can support a single incoming connection from a front end to… ☆167 · Updated this week
- Efficient visual programming for AI language models ☆299 · Updated 2 months ago
- Automatically evaluate your LLMs in Google Colab ☆559 · Updated 6 months ago
- A Python-based web-assisted large language model (LLM) search assistant using Llama.cpp ☆262 · Updated 3 weeks ago
- Generate Synthetic Data Using OpenAI, MistralAI or AnthropicAI ☆221 · Updated 6 months ago
- Task-based Agentic Framework using StrictJSON as the core ☆436 · Updated last month
- FastMLX is a high-performance, production-ready API to host MLX models. ☆218 · Updated 3 weeks ago
- Open source LLM UI, compatible with all local LLM providers. ☆167 · Updated 2 months ago
- Large-scale LLM inference engine ☆1,134 · Updated this week
- Stateful load balancer custom-tailored for llama.cpp ☆563 · Updated this week
- Software to implement GoT with a Weaviate vector database ☆635 · Updated 5 months ago
- Dataset Crafting w/ RAG/Wikipedia ground truth and Efficient Fine-Tuning Using MLX and Unsloth. Includes configurable dataset annotation … ☆162 · Updated 4 months ago
- A tool for generating function arguments and choosing what function to call with local LLMs ☆340 · Updated 8 months ago
- Web UI for ExLlamaV2 ☆445 · Updated last month
- Simple Python library/structure to ablate features in LLMs which are supported by TransformerLens ☆333 · Updated 5 months ago
- MLX-VLM is a package for running Vision LLMs locally on your Mac using MLX. ☆496 · Updated this week
- Convert Compute And Books Into Instruct-Tuning Datasets! Makes: QA, RP, Classifiers. ☆1,033 · Updated last week
- Open Source Application for Advanced LLM Engineering: interact, train, fine-tune, and evaluate large language models on your own computer… ☆416 · Updated this week