AIAnytime / LLM-Inference-API-in-Rust
LLM Inference API in Rust. It also includes a Streamlit app that calls the running Rust API.
☆20 · Updated last year
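To give a rough picture of the pattern the description refers to (a Rust HTTP service exposing a text-generation endpoint that a client such as a Streamlit app can call), here is a minimal sketch assuming an axum-based server. The `/generate` route, the request and response shapes, and the placeholder echo handler are illustrative assumptions, not code taken from this repository.

```rust
use axum::{routing::post, Json, Router};
use serde::{Deserialize, Serialize};

// Hypothetical request/response types; the real API may use different fields.
#[derive(Deserialize)]
struct GenerateRequest {
    prompt: String,
}

#[derive(Serialize)]
struct GenerateResponse {
    completion: String,
}

// Placeholder handler: a real server would invoke an LLM backend here
// instead of echoing the prompt back.
async fn generate(Json(req): Json<GenerateRequest>) -> Json<GenerateResponse> {
    Json(GenerateResponse {
        completion: format!("echo: {}", req.prompt),
    })
}

#[tokio::main]
async fn main() {
    // A client (e.g. a Streamlit app) would POST JSON to http://127.0.0.1:8080/generate.
    let app = Router::new().route("/generate", post(generate));
    let listener = tokio::net::TcpListener::bind("127.0.0.1:8080").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}
```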
Alternatives and similar repositories for LLM-Inference-API-in-Rust:
Users interested in LLM-Inference-API-in-Rust are comparing it to the libraries listed below
- On-device LLM inference using the MediaPipe LLM Inference API. ☆21 · Updated last year
- A quick Crew AI tutorial ☆23 · Updated 11 months ago
- Ask shortgpt for instant and concise answers ☆13 · Updated last year
- Light WebUI for lm.rs ☆23 · Updated 6 months ago
- ⚡️Lightning fast in-memory VectorDB written in Rust🦀 ☆20 · Updated last month
- Here is a collection of cool applications that I've built with AssemblyAI ☆35 · Updated 8 months ago
- Self-hosted Solution for Running AI-Generated Code in Secure Sandboxes ☆53 · Updated this week
- Medical Mixture of Experts LLM using Mergekit. ☆20 · Updated last year
- Using langchain, deeplake and openai to create a Q&A on the Mojo programming manual ☆22 · Updated last year
- Various simple code examples utilising large language models and related tools ☆11 · Updated 10 months ago
- VSCode Copilot for Groq fans! ☆41 · Updated 9 months ago
- ☆42 · Updated last year
- ☆65 · Updated last year
- 🦀 A Pure Rust Framework For Building AGI (WIP). ☆70 · Updated this week
- Super-simple, fully Rust powered "memory" (doc store + semantic search) for LLM projects, semantic search, etc. ☆58 · Updated last year
- Run Python functions on desktop, mobile, web, and in the cloud. https://fxn.ai/explore ☆57 · Updated last week
- AI Assistant ☆20 · Updated last week
- Task management for AI agents ☆14 · Updated this week
- ☆44 · Updated 9 months ago
- The very first artist assistant ☆20 · Updated last year
- Function Calling Mistral 7B. Learn how to implement function calling for open-source LLMs. ☆48 · Updated last year
- Aitino is a platform for creating crews of AI agents that help you automate tasks and solve complex problems. ☆82 · Updated last month
- 👁️ Multimodal LLM vision multitool ☆26 · Updated 6 months ago
- 🤖📝 A markdown editor powered by AI (Ollama) ☆63 · Updated 6 months ago
- AI Agents with Google's Gemini Pro and Gemini Pro Vision models ☆27 · Updated last year
- Simple Chainlit UI for running LLMs locally using Ollama and LangChain ☆44 · Updated last year
- ☆59 · Updated last year
- A high-performance batching router that optimises throughput for text inference workloads ☆16 · Updated last year
- ☆38 · Updated last year
- A Rust library for building and managing generative AI agents, leveraging the capabilities of large language models (LLMs) ☆19 · Updated this week