AITwinMinds / Ollama-in-Google-Colab
This repository provides a guide on how to use Ollama in Google Colab.
☆29 · Updated 11 months ago
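As a rough illustration of what such a guide typically covers, the sketch below installs Ollama in a Colab runtime, starts the server in the background, pulls a model, and queries it over the local HTTP API. This is a minimal sketch, not code taken from the repository itself; the install command follows Ollama's official instructions, and the model name "llama3" is an illustrative assumption.

```python
# Minimal sketch: run Ollama inside a Google Colab notebook.
# Assumes a standard Linux Colab runtime; "llama3" is an illustrative model name.
import subprocess
import time
import requests

# Install Ollama with the official install script.
subprocess.run("curl -fsSL https://ollama.com/install.sh | sh", shell=True, check=True)

# Start the Ollama server in the background and give it a moment to come up.
server = subprocess.Popen(["ollama", "serve"])
time.sleep(5)

# Pull a model, then query it through the local HTTP API (default port 11434).
subprocess.run(["ollama", "pull", "llama3"], check=True)
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Why use Ollama in Colab?", "stream": False},
)
print(resp.json()["response"])
```

In an actual notebook the shell steps would usually be `!` cells rather than `subprocess` calls; they are written as Python here only to keep the sketch self-contained.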
Alternatives and similar repositories for Ollama-in-Google-Colab:
Users interested in Ollama-in-Google-Colab are comparing it to the repositories listed below.
- Groq-Whisper Fast Transcription App built using Groq API and Streamlit. ☆22 · Updated 4 months ago
- Examples of RAG using LangChain with local LLMs - Mixtral 8x7B, Llama 2, Mistral 7B, Orca 2, Phi-2, Neural 7B. ☆37 · Updated last year
- Jupyter Notebooks for Ollama integration. ☆123 · Updated last month
- Simple Chainlit UI for running LLMs locally using Ollama and LangChain (a minimal sketch of this pattern appears after this list). ☆41 · Updated 11 months ago
- Code and resources showcasing the Retrieval-Augmented Generation (RAG) technique, a solution for enhancing data freshness in Large Langua… ☆48 · Updated last year
- A RAG implementation using an open-source stack. BioMistral 7B has been used to build this app along with PubMedBert as an embedding… ☆64 · Updated last year
- Your Chatbot Mastery: build a super small custom AI assistant with Gradio_client Python and Streamlit - Chapter 1. ☆15 · Updated 9 months ago
- ☆62 · Updated 9 months ago
- Run CrewAI agent workflows on local LLMs with Llamafile and Ollama. ☆38 · Updated 8 months ago
- Allows two LLMs to communicate and run code in the terminal. ☆21 · Updated 2 months ago
- Meow meow, dis iz a GitPurr repository of CatGDP fur feline whiskerful conversations. Pawsome, right? Hiss-tory in the making! Happy Cat… ☆49 · Updated 3 months ago
- Democratizing Function Calling Capabilities for Open-Source Language Models. ☆38 · Updated 9 months ago
- Question Answer Generation App using Mistral 7B, Langchain, and FastAPI. ☆64 · Updated last year
- How to run a local server on LM Studio. ☆32 · Updated 9 months ago
- Haystack and Mistral 7B RAG Implementation. It is based on a completely open-source stack. ☆79 · Updated last year
- Simple Chainlit app for interacting with your documents using different vector stores. ☆25 · Updated last year
- Function Calling Mistral 7B. Learn how to make function calls with open-source LLMs. ☆48 · Updated last year
- llmware RAG Demo App. ☆16 · Updated last year
- Chainlit app for advanced RAG. Uses llamaparse, langchain, qdrant, and models from groq. ☆40 · Updated 8 months ago
- ElasticSearch agent based on ElasticSearch, LangChain and ChatGPT 4. ☆43 · Updated last year
- ☆45 · Updated last year
- ☆14 · Updated 4 months ago
- Agentic RAG using Crew AI. ☆24 · Updated 7 months ago
- Web interface for administering Ollama and model quantization, with public endpoints and an automated OpenAI proxy. ☆51 · Updated 9 months ago
- On-device LLM inference using the MediaPipe LLM Inference API. ☆21 · Updated 10 months ago
- Updating this repo every week; you may want to STAR it :) ☆63 · Updated 6 months ago
- Repo of the code from the Medium article. ☆20 · Updated 8 months ago
- SLIM Models by LLMWare. A Streamlit app showing the capabilities for AI agents and function calls. ☆20 · Updated last year
- Example project building a data visualization app using Streamlit and LIDA. ☆74 · Updated last year
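As referenced above, several of these repositories follow the same basic pattern of pointing LangChain at a locally served Ollama model. The snippet below is a minimal sketch of that pattern, not code from any of the listed repos; it assumes `langchain-community` is installed and an Ollama server is already running locally, and "llama3" is purely an illustrative model name.

```python
# Minimal sketch: query a local Ollama model through LangChain.
# Assumes `pip install langchain-community` and a running Ollama server;
# the model name "llama3" is an illustrative assumption.
from langchain_community.llms import Ollama

llm = Ollama(model="llama3", temperature=0)

# Send a single prompt to the locally hosted model and print the reply.
print(llm.invoke("Summarize what Retrieval-Augmented Generation (RAG) is in one sentence."))
```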