AIAnytime / On-device-LLM-Inference-using-Mediapipe
On-device LLM inference using the MediaPipe LLM Inference API.
☆22 · Updated last year
Alternatives and similar repositories for On-device-LLM-Inference-using-Mediapipe
Users that are interested in On-device-LLM-Inference-using-Mediapipe are comparing it to the libraries listed below
- A RAG-powered web search with Tavily, LangChain, and Mistral AI (leveraging the Groq LPU). The full-stack web app is built in Databutton. ☆36 · Updated last year
- Medical Mixture-of-Experts LLM using Mergekit. ☆20 · Updated last year
- SLIM models by LLMWare. A Streamlit app showing the capabilities of AI agents and function calls. ☆20 · Updated last year
- Agentic RAG using Crew AI. ☆29 · Updated last year
- YouTube video summarization app built using open-source LLMs and frameworks like Llama 2, Haystack, Whisper, and Streamlit. This app smooth… ☆56 · Updated last year
- llmware RAG Demo App. ☆17 · Updated last year
- Groq goes brrrrr... so had to make a basic Streamlit app you can build upon!