tyrell / llm-ollama-llamaindex-bootstrap

Designed for offline use, this RAG application template is a starting point for building your own local RAG pipeline with Ollama and LlamaIndex, independent of online APIs and cloud-based LLM services such as OpenAI.
41 stars · Updated 11 months ago

Alternatives and similar repositories for llm-ollama-llamaindex-bootstrap:
