dcSpark-AI / open-LLM-server
Run local LLMs via HTTP API in a single command (Windows/Mac/Linux)
☆61 · Updated 2 years ago
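Projects in this space typically expose a small HTTP endpoint that accepts a prompt and returns generated text. As a rough illustration, a client call might look like the sketch below; the port, route (`/submit_prompt`), and JSON fields are assumptions made for the example, not open-LLM-server's documented API, so check the project's README for the real interface.

```python
# Minimal sketch of querying a locally hosted LLM HTTP server.
# The route, payload fields, and response field below are illustrative
# assumptions, not open-LLM-server's documented API.
import requests


def ask_local_llm(prompt: str, base_url: str = "http://localhost:8000") -> str:
    """Send a prompt to a local LLM server and return the generated text."""
    resp = requests.post(
        f"{base_url}/submit_prompt",                 # hypothetical route
        json={"prompt": prompt, "max_tokens": 128},  # hypothetical payload
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json().get("response", "")           # hypothetical response field


if __name__ == "__main__":
    print(ask_local_llm("Summarize what a local LLM HTTP server does."))
```

The same pattern applies to the OpenAI-compatible servers listed below: swap the route for `/v1/completions` or `/v1/chat/completions` and adjust the payload to match that schema.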
Alternatives and similar repositories for open-LLM-server
Users interested in open-LLM-server are comparing it to the libraries listed below.
- Super-simple, fully Rust-powered "memory" (doc store + semantic search) for LLM projects ☆64 · Updated 2 years ago
- Harnessing the Memory Power of the Camelids ☆147 · Updated 2 years ago
- CrustAGI is a Task-driven Autonomous Agent experiment written in Rust ☆45 · Updated 2 years ago
- OpenAI-compatible API for serving the LLaMA-2 model ☆218 · Updated 2 years ago
- Multi-platform desktop app to download and run Large Language Models (LLMs) locally on your computer ☆290 · Updated 2 years ago
- Unofficial Python bindings for the Rust llm library. 🐍❤️🦀 ☆76 · Updated 2 years ago
- Load local LLMs effortlessly in a Jupyter notebook for testing purposes alongside LangChain or other agents. Contains Oobabooga and Kobol… ☆213 · Updated 2 years ago
- An autonomous AI agent extension for Oobabooga's web UI ☆173 · Updated 2 years ago
- An Autonomous LLM Agent that runs on Wizcoder-15B ☆334 · Updated last year
- Run inference on the replit-3B code instruct model using CPU ☆160 · Updated 2 years ago
- BabyAGI to run with locally hosted models using the API from https://github.com/oobabooga/text-generation-webui ☆88 · Updated 2 years ago
- Landmark Attention: Random-Access Infinite Context Length for Transformers QLoRA ☆124 · Updated 2 years ago
- Real-time Fallacy Detection using OpenAI Whisper and ChatGPT/LLaMA/Mistral ☆116 · Updated 2 years ago
- Like System Requirements Lab, but for LLMs ☆31 · Updated 2 years ago
- GPT-2 small trained on phi-like data ☆67 · Updated last year
- A fast batching API to serve LLMs ☆189 · Updated last year
- Local LLM ReAct Agent with Guidance ☆159 · Updated 2 years ago
- Falcon LLM ggml framework with CPU and GPU support ☆248 · Updated last year
- A simple and clear way of hosting llama.cpp as a private HTTP API using Rust ☆27 · Updated last year
- An easy way to host your own AI API and expose alternative models, while being compatible with "open" AI clients. ☆332 · Updated last year
- Experimental LLM Inference UX to aid in creative writing ☆127 · Updated last year
- AI stack for interacting with LLMs, Stable Diffusion, Whisper, xTTS and many other AI models ☆168 · Updated last year
- ♾️ toolkit for air-gapped LLMs on consumer-grade hardware ☆226 · Updated 2 years ago
- ☆137 · Updated 2 years ago
- TheBloke's Dockerfiles ☆308 · Updated last year
- Edge full-stack LLM platform. Written in Rust ☆382 · Updated last year
- Use a local LLaMA LLM or OpenAI to chat about, discuss, or summarize your documents, YouTube videos, and so on. ☆154 · Updated last year
- A prompt/context management system ☆170 · Updated 2 years ago
- ☆216 · Updated 2 years ago
- llama.cpp with the BakLLaVA model describes what it sees ☆379 · Updated 2 years ago