graniet / kheish
Kheish: A multi-role LLM agent for tasks like code auditing, file searching, and more, seamlessly leveraging RAG and extensible modules.
☆141 · Updated 8 months ago
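The description above mentions a multi-role agent grounded by RAG and extensible modules. The sketch below is a minimal, hypothetical illustration of that general pattern in plain Rust: none of the names (Role, Task, retrieve_context, call_llm) come from kheish's codebase, and both the retrieval step and the model call are stubbed so the example runs with the standard library only.

```rust
// A minimal, hypothetical sketch of the multi-role agent + RAG loop that a tool
// like kheish builds on. All names here (Role, Task, retrieve_context, call_llm)
// are illustrative assumptions, not kheish's actual API.

/// A role the agent can assume while working on a task.
#[derive(Debug, Clone, Copy)]
enum Role {
    Proposer,
    Reviewer,
}

/// A task plus a tiny in-memory document store standing in for a real RAG index.
struct Task {
    goal: String,
    documents: Vec<String>,
}

/// Naive keyword retrieval; a real pipeline would use embeddings and a vector store.
fn retrieve_context<'a>(task: &'a Task) -> Vec<&'a str> {
    task.documents
        .iter()
        .filter(|doc| task.goal.split_whitespace().any(|word| doc.contains(word)))
        .map(String::as_str)
        .collect()
}

/// Stub for the model call; a real agent would hit a local or hosted LLM here.
fn call_llm(role: Role, prompt: &str) -> String {
    format!("[{role:?}] response to: {prompt}")
}

fn main() {
    let task = Task {
        goal: "audit unsafe code".to_string(),
        documents: vec![
            "module a uses unsafe code for FFI".to_string(),
            "module b is pure safe Rust".to_string(),
        ],
    };

    // RAG step: ground the prompt in retrieved context, then alternate roles.
    // A real agent would loop until the reviewer accepts; one round shown here.
    let context = retrieve_context(&task).join("\n");
    let proposal = call_llm(Role::Proposer, &format!("{}\n---\n{}", task.goal, context));
    let review = call_llm(Role::Reviewer, &proposal);

    println!("{proposal}\n{review}");
}
```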
Alternatives and similar repositories for kheish
Users interested in kheish are comparing it to the libraries listed below
- Built for demanding AI workflows, this gateway offers low-latency, provider-agnostic access, ensuring your AI applications run smoothly a… ☆76 · Updated 3 months ago
- Git-like RAG pipeline ☆244 · Updated this week
- The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from Probabilistic LLM Vibes ☆235 · Updated last month
- Code for fine-tuning LLMs with GRPO, specifically for Rust programming using cargo as feedback ☆104 · Updated 6 months ago
- Rust implementation of Surya ☆60 · Updated 6 months ago
- Super-simple, fully Rust-powered "memory" (doc store + semantic search) for LLM projects, etc. ☆62 · Updated last year
- Fast serverless LLM inference, in Rust. ☆92 · Updated 6 months ago
- Burn through tech debt with AI agents! ☆302 · Updated this week
- AI Assistant ☆20 · Updated 5 months ago
- Library for doing RAG ☆76 · Updated last week
- High-performance framework for building interactive multi-agent workflow systems in Rust ☆141 · Updated last week
- A pure Rust-based LLM (any LLM-based MLLM such as Spark-TTS) inference engine, powered by the Candle framework. ☆159 · Updated last month
- A memory framework for Large Language Models and Agents. ☆183 · Updated 8 months ago
- Use multiple LLM backends in a single crate, with simple builder-based configuration and built-in prompt chaining & templating. ☆135 · Updated 4 months ago
- 🦀 A Pure Rust Framework For Building AGI (WIP). ☆102 · Updated last week
- A single-binary, GPU-accelerated LLM server (HTTP and WebSocket API) written in Rust ☆79 · Updated last year
- OpenAI-compatible API for serving the LLAMA-2 model ☆218 · Updated last year
- A simple, CUDA- or CPU-powered library for creating vector embeddings using Candle and models from Hugging Face ☆39 · Updated last year
- ⚡ Edgen: Local, private GenAI server alternative to OpenAI. No GPU required. Run AI models locally: LLMs (Llama2, Mistral, Mixtral...), … ☆367 · Updated last year
- ChronoMind: Redefining Vector Intelligence Through Time. ☆72 · Updated 4 months ago
- The AI router ☆334 · Updated this week
- A Fish Speech implementation in Rust, with Candle.rs ☆97 · Updated 3 months ago
- A fork of OpenAI Swarm that supports Groq and Anthropic ☆122 · Updated 7 months ago
- AI gateway and observability server written in Rust. Designed to help optimize multi-agent workflows. ☆63 · Updated last year
- ⚡️ Lightning-fast in-memory VectorDB written in Rust 🦀 ☆25 · Updated 6 months ago
- Open-source alternative to Perplexity AI with the ability to run locally ☆216 · Updated 11 months ago
- Fast, streaming indexing, query, and agentic LLM applications in Rust ☆556 · Updated last week
- A minimal implementation of GraphRAG, designed to quickly prototype whether you're able to get good sense-making out of a large dataset w… ☆37 · Updated 7 months ago
- llmbasedos — Local-First OS Where Your AI Agents Wake Up and Work ☆273 · Updated last month
- CursorCore: Assist Programming through Aligning Anything ☆131 · Updated 7 months ago