minskylab / auto-rust
auto-rust is an experimental project that automatically generates Rust code with LLMs (Large Language Models) during compilation, using procedural macros.
☆42 · Updated 11 months ago
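For context, the core idea is that a procedural attribute macro intercepts a function whose body is left unimplemented and asks an LLM to generate that body while the crate compiles. Below is a minimal usage sketch of that pattern; the macro name `llm_tool` and the attribute-style API are assumptions for illustration and may not match auto-rust's actual exported interface.

```rust
// Hypothetical usage sketch: the `llm_tool` macro name and its behavior are
// assumptions for illustration, not a confirmed auto-rust API.
use auto_rust::llm_tool;

/// Returns a short greeting for `name` in the requested `language`.
/// The doc comment and the function signature are the context an LLM
/// would use to generate the body during macro expansion.
#[llm_tool]
fn generate_greeting(name: &str, language: &str) -> String {
    // Placeholder body; the procedural macro is expected to replace it
    // with LLM-generated code at compile time.
    todo!()
}

fn main() {
    println!("{}", generate_greeting("Ada", "English"));
}
```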
Alternatives and similar repositories for auto-rust
Users interested in auto-rust are comparing it to the libraries listed below:
- A simple, CUDA- or CPU-powered library for creating vector embeddings using Candle and models from Hugging Face ☆45 · Updated last year
- A single-binary, GPU-accelerated LLM server (HTTP and WebSocket API) written in Rust ☆79 · Updated last year
- A library for doing RAG (retrieval-augmented generation) ☆77 · Updated this week
- 🦀 A Pure Rust Framework For Building AGI (WIP) ☆107 · Updated 3 weeks ago
- A set of Rust macros for working with OpenAI function/tool calls ☆54 · Updated last year
- Fast serverless LLM inference, in Rust ☆94 · Updated 7 months ago
- Anthropic Rust SDK 🦀 with async support ☆66 · Updated 8 months ago
- Build tools for LLMs in Rust using the Model Context Protocol ☆38 · Updated 8 months ago
- Andrej Karpathy's "Let's build GPT: from scratch" video & notebook implemented in Rust + candle ☆77 · Updated last year
- Low-rank adaptation (LoRA) for Candle ☆162 · Updated 6 months ago
- Super-simple, fully Rust-powered "memory" (doc store + semantic search) for LLM projects, semantic search, etc. ☆62 · Updated 2 years ago
- allms: One Rust Library to rule them aLLMs ☆99 · Updated this week
- AI gateway and observability server written in Rust, designed to help optimize multi-agent workflows ☆64 · Updated last year
- bott: Your Terminal Copilot ☆88 · Updated last year
- llm_utils: Basic LLM tools, best practices, and minimal abstraction ☆47 · Updated 8 months ago
- Deploy dioxus-web to Vercel ☆28 · Updated last year
- LLaMA 7B with CUDA acceleration implemented in Rust. Minimal GPU memory needed! ☆110 · Updated 2 years ago
- Your AI Copilot in Rust ☆48 · Updated last year
- A Rust vector for large amounts of data that does not copy when growing, by using full `mmap`'d pages ☆22 · Updated last year
- Bleeding-edge low-level Rust bindings for GGML ☆16 · Updated last year
- Model Context Protocol (MCP) CLI server template for Rust ☆80 · Updated 6 months ago
- Structured outputs for LLMs ☆52 · Updated last year
- Rust library for scheduling, managing resources, and running DAGs 🌙 ☆34 · Updated 8 months ago
- Friendly interface to chat with an Ollama instance ☆83 · Updated last month
- Implementation of the BitNet model in Rust ☆40 · Updated last year
- OpenAI-compatible API for serving the LLaMA-2 model ☆218 · Updated 2 years ago
- ☆37 · Updated 10 months ago
- A Rust 🦀 port of the Hugging Face smolagents library ☆39 · Updated 6 months ago
- A tool to extract images from PDF files ☆61 · Updated last year
- Use multiple LLM backends in a single crate, with simple builder-based configuration and built-in prompt chaining & templating ☆137 · Updated 5 months ago