graniet / kheish
Kheish: A multi-role LLM agent for tasks like code auditing, file searching, and more—seamlessly leveraging RAG and extensible modules.
☆139 · Updated 4 months ago
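For orientation, the sketch below shows the retrieve-then-prompt loop that RAG-driven agents such as kheish build on: rank stored chunks against a query embedding, then splice the best match into the prompt sent to the LLM. It is a minimal, self-contained Rust illustration of the general pattern only; every type, value, and function name in it is hypothetical and does not reflect kheish's actual API or module system.

```rust
// Hypothetical sketch of the retrieve-then-prompt (RAG) pattern; not kheish's API.

/// A document chunk paired with a pre-computed embedding vector.
struct Chunk {
    text: &'static str,
    embedding: Vec<f32>,
}

/// Cosine similarity between two equal-length vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

fn main() {
    // Toy corpus with hand-written 3-dimensional "embeddings" (illustrative only).
    let corpus = vec![
        Chunk { text: "fn parse_config reads TOML settings", embedding: vec![0.9, 0.1, 0.0] },
        Chunk { text: "unsafe block copies raw pointers", embedding: vec![0.1, 0.9, 0.2] },
        Chunk { text: "CLI flag --audit enables code review", embedding: vec![0.2, 0.8, 0.3] },
    ];

    // Embedding of the user query (in a real pipeline this comes from a model).
    let query = vec![0.15f32, 0.85, 0.25];

    // Rank chunks by similarity to the query and keep the best match.
    let best = corpus
        .iter()
        .max_by(|a, b| {
            cosine(&a.embedding, &query)
                .partial_cmp(&cosine(&b.embedding, &query))
                .unwrap()
        })
        .unwrap();

    // Assemble the augmented prompt that would be sent to the LLM backend.
    let prompt = format!(
        "Context:\n{}\n\nQuestion: which part of the code handles auditing?",
        best.text
    );
    println!("{prompt}");
}
```

In a real pipeline the embeddings would come from an embedding model and the corpus from an indexed document store; the hard-coded three-dimensional vectors above only keep the example runnable.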
Alternatives and similar repositories for kheish
Users interested in kheish are comparing it to the libraries listed below.
- The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from Probabilistic LLM Vibes ☆199 · Updated 3 months ago
- Code for fine-tuning LLMs with GRPO specifically for Rust programming, using cargo as feedback ☆86 · Updated 2 months ago
- Git-like RAG pipeline ☆206 · Updated this week
- Built for demanding AI workflows, this gateway offers low-latency, provider-agnostic access, ensuring your AI applications run smoothly a… ☆63 · Updated 2 months ago
- Rust implementation of Surya ☆58 · Updated 2 months ago
- A memory framework for Large Language Models and Agents. ☆178 · Updated 4 months ago
- Use multiple LLM backends in a single crate, with simple builder-based configuration and built-in prompt chaining & templating. ☆124 · Updated this week
- Super-simple, fully Rust-powered "memory" (doc store + semantic search) for LLM projects and more. ☆59 · Updated last year
- A pure-Rust LLM inference engine (for any LLM-based MLLM such as Spark-TTS), powered by the Candle framework. ☆112 · Updated last month
- Burn through tech debt with AI agents! ☆236 · Updated this week
- Library for doing RAG ☆72 · Updated 2 weeks ago
- Fast, streaming indexing, query, and agentic LLM applications in Rust ☆482 · Updated this week
- AI Assistant ☆20 · Updated last month
- Open-source alternative to Perplexity AI with the ability to run locally ☆203 · Updated 7 months ago
- Generate ideal question-answer pairs for testing RAG ☆126 · Updated 2 months ago
- A minimal implementation of GraphRAG, designed to quickly prototype whether you're able to get good sense-making out of a large dataset w… ☆28 · Updated 3 months ago
- ChronoMind: Redefining Vector Intelligence Through Time. ☆68 · Updated 2 weeks ago
- A fork of OpenAI Swarm that supports Groq and Anthropic ☆119 · Updated 3 months ago
- OpenAI-compatible API for serving the LLaMA-2 model ☆218 · Updated last year
- Split code into semantic chunks ☆24 · Updated 7 months ago
- Native OCR for macOS, Windows, and Linux ☆168 · Updated last month
- Minimalistic Rust implementation of the Model Context Protocol from Anthropic ☆56 · Updated 2 months ago
- 🦛 CHONK your texts with Chonkie ✨ - The no-nonsense RAG chunking library ☆36 · Updated 6 months ago
- A simple, CUDA- or CPU-powered library for creating vector embeddings using Candle and models from Hugging Face ☆37 · Updated last year
- Model Context Protocol (MCP) CLI server template for Rust ☆75 · Updated last month
- Sidecar is the AI brains for the Aide editor and works alongside it, locally on your machine ☆552 · Updated this week
- Chrome extension for running generative AI locally in your browser ☆78 · Updated 3 months ago
- Build, Improve Performance, and Productionize your LLM Application with an Integrated Framework ☆339 · Updated 5 months ago
- A single-binary, GPU-accelerated LLM server (HTTP and WebSocket API) written in Rust ☆80 · Updated last year
- The hearth of the Pulsar App: fast, secure, and shared inference with a modern UI ☆56 · Updated 5 months ago