minskylab / auto-rust
auto-rust is an experimental project that automatically generates Rust code with LLMs (Large Language Models) during compilation, using procedural macros.
☆40 · Updated 8 months ago
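For context on what "generating Rust code during compilation via procedural macros" means mechanically, below is a minimal sketch of a procedural attribute macro that rewrites a function body at compile time. This is not auto-rust's actual implementation or API; the macro name `llm_generated`, the hard-coded body standing in for the LLM call, and the `syn`/`quote` setup are assumptions made purely for illustration.

```rust
// lib.rs of a separate proc-macro crate (Cargo.toml needs `proc-macro = true`,
// plus `syn = { version = "2", features = ["full"] }` and `quote = "1"`).
// Hypothetical sketch only: a compile-time codegen macro like auto-rust's would
// send the function signature and doc comments to an LLM here and splice the
// returned Rust source back in; this version substitutes a fixed body instead.
use proc_macro::TokenStream;
use quote::quote;
use syn::{parse_macro_input, ItemFn};

#[proc_macro_attribute]
pub fn llm_generated(_attr: TokenStream, item: TokenStream) -> TokenStream {
    let func = parse_macro_input!(item as ItemFn);
    let attrs = &func.attrs; // doc comments: these would form the LLM prompt
    let vis = &func.vis;
    let sig = &func.sig;

    // Stand-in for the compile-time LLM call.
    let generated_body = quote! { { 42 } };

    // Re-emit the original signature and attributes with the generated body,
    // discarding whatever body the user wrote.
    quote! {
        #(#attrs)*
        #vis #sig #generated_body
    }
    .into()
}
```

A caller would annotate a stub function (for example `#[llm_generated] fn answer() -> i32 { unimplemented!() }`), and the macro replaces its body during compilation, which is the general mechanism the project description above refers to.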
Alternatives and similar repositories for auto-rust
Users interested in auto-rust are comparing it to the libraries listed below.
- A single-binary, GPU-accelerated LLM server (HTTP and WebSocket API) written in Rust ☆80 · Updated last year
- Super-simple, fully Rust-powered "memory" (doc store + semantic search) for LLM projects, semantic search, etc. ☆62 · Updated last year
- 🦀 A Pure Rust Framework For Building AGI (WIP). ☆92 · Updated this week
- Andrej Karpathy's "Let's build GPT: from scratch" video & notebook implemented in Rust + candle ☆73 · Updated last year
- A set of Rust macros for working with OpenAI function/tool calls. ☆50 · Updated last year
- Fast serverless LLM inference, in Rust. ☆88 · Updated 5 months ago
- bott: Your Terminal Copilot ☆87 · Updated last year
- Anthropic Rust SDK 🦀 with async support. ☆63 · Updated 5 months ago
- A tool to extract images from PDF files ☆61 · Updated last year
- A simple, CUDA- or CPU-powered library for creating vector embeddings using Candle and models from Hugging Face ☆37 · Updated last year
- Low-rank adaptation (LoRA) for Candle. ☆152 · Updated 3 months ago
- Friendly interface to chat with an Ollama instance. ☆74 · Updated 2 weeks ago
- A Rust vector for large amounts of data that does not copy when growing, by using full `mmap`'d pages. ☆22 · Updated last year
- allms: One Rust Library to rule them aLLMs ☆89 · Updated this week
- Library for doing RAG ☆74 · Updated last week
- Build tools for LLMs in Rust using the Model Context Protocol ☆38 · Updated 5 months ago
- The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from Probabilistic LLM Vibes ☆219 · Updated last week
- Implementing the BitNet model in Rust ☆38 · Updated last year
- Your AI Copilot in Rust ☆48 · Updated last year
- AI gateway and observability server written in Rust. Designed to help optimize multi-agent workflows. ☆63 · Updated last year
- ☆37 · Updated 8 months ago
- LLaMa 7B with CUDA acceleration implemented in Rust. Minimal GPU memory needed! ☆108 · Updated 2 years ago
- Model Context Protocol (MCP) CLI server template for Rust ☆78 · Updated 3 months ago
- llm_utils: Basic LLM tools, best practices, and minimal abstraction. ☆46 · Updated 5 months ago
- A distributed execution framework built upon lunatic. ☆16 · Updated last year
- Rust library for scheduling, managing resources, and running DAGs 🌙 ☆33 · Updated 6 months ago
- OpenAI-compatible API for serving the LLaMA-2 model ☆218 · Updated last year
- CrabGrab + Tauri example app ☆57 · Updated last year
- Core library for computer graphics & vision applications ☆101 · Updated last year
- Bleeding-edge low-level Rust binding for GGML ☆16 · Updated last year