ShelbyJenkins / llm_client
The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from Probabilistic LLM Vibes
☆219 · Updated 2 weeks ago
Alternatives and similar repositories for llm_client
Users interested in llm_client are comparing it to the libraries listed below.
- Fast, streaming indexing, query, and agentic LLM applications in Rust ☆525 · Updated this week
- Use multiple LLM backends in a single crate, with simple builder-based configuration and built-in prompt chaining & templating ☆132 · Updated 2 months ago
- LLM orchestrator built in Rust ☆280 · Updated last year
- Rust library for generating vector embeddings and reranking ☆562 · Updated 3 weeks ago
- Low-rank adaptation (LoRA) for Candle ☆152 · Updated 3 months ago
- High-level, optionally asynchronous Rust bindings to llama.cpp ☆226 · Updated last year
- A powerful Rust library and CLI tool to unify and orchestrate multiple LLM, agent, and voice backends (OpenAI, Claude, Gemini, Ollama, Ele…) ☆163 · Updated last week
- A simple, CUDA- or CPU-powered library for creating vector embeddings using Candle and models from Hugging Face