edgenai / edgen
⚡ Edgen: Local, private GenAI server alternative to OpenAI. No GPU required. Run AI models locally: LLMs (Llama2, Mistral, Mixtral...), speech-to-text (Whisper), and many others.
☆339 · Updated 6 months ago
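Since edgen advertises an OpenAI-compatible HTTP API, a local client can talk to it the same way it would talk to OpenAI's hosted endpoint. Below is a minimal sketch in Rust using `reqwest`; the base URL, port (`33322`), and the `"default"` model name are assumptions for illustration, not confirmed defaults — check edgen's own documentation for the actual values.

```rust
// Minimal sketch: calling a locally running edgen server through an
// OpenAI-compatible chat-completions endpoint.
// Assumptions (not confirmed defaults): base URL http://localhost:33322/v1
// and model id "default"; adjust to whatever your edgen instance uses.
//
// Assumed Cargo.toml dependencies:
//   reqwest = { version = "0.12", features = ["blocking", "json"] }
//   serde_json = "1"

use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();

    // Same request shape the OpenAI API expects, pointed at the local server.
    let body = json!({
        "model": "default", // placeholder model id
        "messages": [
            { "role": "user", "content": "Summarize what edgen does in one sentence." }
        ]
    });

    let resp = client
        .post("http://localhost:33322/v1/chat/completions") // assumed local endpoint
        .json(&body)
        .send()?
        .error_for_status()?;

    // Print the raw JSON response; a real client would deserialize it.
    println!("{}", resp.text()?);
    Ok(())
}
```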
Related projects
Alternatives and complementary repositories for edgen
- The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from Probabilistic LLM Vibes ☆133 · Updated last month
- Library for generating vector embeddings and reranking, written in Rust ☆308 · Updated this week
- OpenAI-compatible API for serving the LLAMA-2 model ☆215 · Updated last year
- ☆166 · Updated this week
- Fast, streaming indexing and query library for AI (RAG) applications, written in Rust ☆259 · Updated this week
- 🤖 TUI interface for LLMs written in Rust ☆374 · Updated 2 months ago
- Hybrid vector database with a flexible SQL storage engine & multi-index support. ☆359 · Updated last week
- ☆136 · Updated 9 months ago
- A multi-platform desktop application to evaluate and compare LLMs, written in Rust and React. ☆497 · Updated this week
- A single-binary, GPU-accelerated LLM server (HTTP and WebSocket API) written in Rust ☆80 · Updated 10 months ago
- Open-source alternative to Perplexity AI with the ability to run locally ☆150 · Updated last month
- High-performance key-value store for ML inference. 100x faster than Redis. ☆210 · Updated 6 months ago
- A blazing-fast, GPU-accelerated screen capture tool written in Rust. Select, crop, and copy screen regions with pixel-perfect accuracy us… ☆154 · Updated 2 weeks ago
- Super-simple, fully Rust-powered "memory" (doc store + semantic search) for LLM projects, semantic search, etc. ☆55 · Updated last year
- A Rust implementation of OpenAI's Whisper model using the burn framework ☆271 · Updated 6 months ago
- AI gateway and observability server written in Rust. Designed to help optimize multi-agent workflows. ☆46 · Updated 4 months ago
- Efficient platform for inference and serving local LLMs, including an OpenAI-compatible API server. ☆265 · Updated this week
- A cross-platform browser ML framework. ☆627 · Updated this week
- Rust implementation of Surya ☆52 · Updated last month
- Tera is an AI assistant that is tailored just for you and runs fully locally. ☆61 · Updated 9 months ago
- Low-rank adaptation (LoRA) for Candle. ☆127 · Updated 3 months ago
- High-level, optionally asynchronous Rust bindings to llama.cpp ☆179 · Updated 5 months ago
- bott: Your Terminal Copilot ☆85 · Updated 9 months ago
- Llama2 LLM ported to the Rust burn framework ☆274 · Updated 7 months ago
- LLM Orchestrator built in Rust ☆267 · Updated 8 months ago
- Multi-platform desktop app to download and run Large Language Models (LLMs) locally on your computer. ☆266 · Updated last year
- Run any ML model from any programming language. ☆421 · Updated 10 months ago
- Inference Llama 2 in one file of pure Rust 🦀 ☆229 · Updated last year
- 🦀 🖥️ 🦀 ☆152 · Updated 3 months ago
- Models and examples built with Burn ☆186 · Updated last week