tyrell / llm-ollama-llamaindex-bootstrap

Designed for offline use, this RAG application template offers a starting point for building your own local RAG pipeline, independent of online APIs and cloud-based LLM services like OpenAI.
47 stars · Updated Feb 23, 2024
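The wiring implied by the project name, LlamaIndex running against a local Ollama server, typically looks like the minimal sketch below. This is not code from the repository; it assumes a recent llama-index release with the Ollama LLM and embedding integrations installed, a running local Ollama server, and the "llama2" and "nomic-embed-text" models already pulled (all illustrative choices).

```python
# Minimal local RAG sketch (not the repository's actual code).
# Assumes: pip install llama-index llama-index-llms-ollama llama-index-embeddings-ollama
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama

# Route both generation and embeddings through the local Ollama server,
# so no cloud API or online service is involved.
Settings.llm = Ollama(model="llama2", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Index local documents and answer a question against them.
documents = SimpleDirectoryReader("./data").load_data()  # "./data" is a placeholder path
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
print(query_engine.query("Summarise the indexed documents."))
```

The key point of a template like this is that retrieval (embeddings), indexing, and generation all run against local components, so the pipeline works offline once the models are downloaded.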
