enso-labs / llm-server
🤖 Open-source LLM server (OpenAI, Ollama, Groq, Anthropic) with support for HTTP, Streaming, Agents, RAG
☆32 · Updated 6 months ago
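The tagline above mentions HTTP and streaming support across several providers (OpenAI, Ollama, Groq, Anthropic). As an illustration only, the sketch below streams a chat completion from an OpenAI-compatible `/v1/chat/completions` endpoint; the base URL, route, model name, and payload shape are assumptions for the example, not confirmed details of llm-server's API.

```python
# Minimal sketch: streaming a chat completion from an LLM HTTP server that
# exposes an OpenAI-compatible /v1/chat/completions endpoint.
# NOTE: base URL, route, model name, and response shape are assumptions here;
# check the llm-server docs for the actual API surface.
import json
import requests

BASE_URL = "http://localhost:8000"  # hypothetical local deployment

payload = {
    "model": "gpt-4o-mini",  # model/provider name is an assumption
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": True,  # ask the server for token-by-token streaming
}

with requests.post(f"{BASE_URL}/v1/chat/completions", json=payload, stream=True) as resp:
    resp.raise_for_status()
    # OpenAI-style streaming sends server-sent events: lines prefixed "data: ",
    # terminated by a "data: [DONE]" sentinel.
    for line in resp.iter_lines():
        if not line or not line.startswith(b"data: "):
            continue
        chunk = line[len(b"data: "):]
        if chunk == b"[DONE]":
            break
        delta = json.loads(chunk)["choices"][0].get("delta", {})
        print(delta.get("content", ""), end="", flush=True)
```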
Alternatives and similar repositories for llm-server
Users interested in llm-server are comparing it to the libraries listed below.
- A generalist agent that can go online and accomplish complex tasks using semantic-kernel and autogen. ☆25 · Updated last year
- Simple Chainlit UI for running llms locally using Ollama and LangChain ☆44 · Updated last year
- ☆45 · Updated 11 months ago
- AI Services API: serves langchain, huggingface, & other emergent python AI libraries as a service. This project mainly serves LibreChat, … ☆28 · Updated last year
- AI Agents with Google's Gemini Pro and Gemini Pro Vision Models ☆27 · Updated last year
- Run CrewAI agent workflows on local LLM models with Llamafile and Ollama ☆38 · Updated 11 months ago
- This tool allows you to search ArXiv for scientific papers, extract their content, embed and chunk the text, and ask questions about them… ☆29 · Updated 10 months ago
- On-device real-time RAG App built using Jina Reader, Mediapipe, Gemma 2b IT LLM. ☆13 · Updated last year
- 🌟 DataTonic: A Data-Capable AGI-style Agent Builder of Agents, that creates swarms, runs commands and securely processes and creates d… ☆87 · Updated 11 months ago
- Retrieval Augmented Generation-based Agentic CrewAI ☆26 · Updated 6 months ago
- SLIM Models by LLMWare. A streamlit app showing the capabilities for AI Agents and Function Calls. ☆20 · Updated last year
- Simple Chainlit UI for running llms from Groq and LangChain ☆17 · Updated last year
- Access your Ollama inference server running on your computer from anywhere. Set up with NextJS + Langchain JS LCEL + Ngrok