rapidarchitect / ollama-crew-mesop
☆26 · Updated 6 months ago
Alternatives and similar repositories for ollama-crew-mesop:
Users interested in ollama-crew-mesop are comparing it to the libraries listed below.
- Experience the power of AI with this free AI voice generator demo. Utilizing Deepgram and Groq, we transform text into voice seamlessly. … ☆37 · Updated 9 months ago
- WhisperAnywhere: Effortless speech-to-text everywhere on your Mac. Use a hotkey to dictate in any app, powered by Whisper AI and Groq API… ☆25 · Updated 6 months ago
- Rivet plugin for integration with Ollama, a tool for easily running LLMs locally ☆36 · Updated 10 months ago
- A generalist agent that can go online and accomplish complex tasks using semantic-kernel and autogen. ☆25 · Updated last year
- ChatGPT-like interface for working with AI Agents ☆20 · Updated 6 months ago
- ☆8 · Updated last year
- 🧠 Mem4AI: An LLM-friendly memory management library. ☆20 · Updated 4 months ago
- Terminal Voice Assistant is a powerful and flexible tool designed to help users interact with their terminal using natural language comma… ☆18 · Updated 9 months ago
- ☆13 · Updated 3 weeks ago
- Text with Open Interpreter, running locally on your Mac. Credit: Morisy ☆20 · Updated last year
- This repository provides resources and guidelines to facilitate the integration of Open-WebUI and Langfuse, enabling seamless monitoring … ☆32 · Updated 4 months ago
- ☆10 · Updated last year
- Web UI for gptme, built with lovable.dev ☆24 · Updated this week
- Branch Out Your Conversations ☆32 · Updated 2 months ago
- Modern AI chatbot supporting multiple LLMs. Switch between Gemini, Mistral, Llama, Claude and ChatGPT. ☆54 · Updated last week
- ☆34 · Updated 3 months ago
- Local LLMs in One Line Of Code (thanks to llamafile) ☆39 · Updated last year
- Fork of litellm that is open source ☆18 · Updated 3 months ago
- ☆15 · Updated 2 months ago
- A free and open-source alternative to OpenAI GPT-4 Plus, packed with a powerful code interpreter and gpt-4-vision. ☆57 · Updated last year
- Streamlit Web UI for AGiXT ☆26 · Updated 3 weeks ago
- LLM Use Case: LLM Powered, Reusable, Domain Agnostic Autocompletes ☆57 · Updated 10 months ago
- Access your Ollama inference server running on your computer from anywhere. Set up with NextJS + Langchain JS LCEL + Ngrok ☆26 · Updated last year
- ☆20 · Updated last year
- ☆37 · Updated last year
- This repository provides a simple UI for working with the LiteLLM API ☆18 · Updated 11 months ago
- LlamaCards is a web application that provides a dynamic interface for interacting with LLM models in real-time. This app allows users to … ☆38 · Updated 6 months ago