tyrell / llm-ollama-llamaindex-bootstrap

Designed for offline use, this RAG application template is based on Andrej Baranovskij's tutorials. It offers a starting point for building your own local RAG pipeline, independent of online APIs and cloud-based LLM services like OpenAI.
40 · Updated 10 months ago

Alternatives and similar repositories for llm-ollama-llamaindex-bootstrap:
