tyrell / llm-ollama-llamaindex-bootstrap

Designed for offline use, this RAG application template is based on Andrej Baranovskij's tutorials. It offers a starting point for building your own local RAG pipeline, independent of online APIs and cloud-based LLM services such as OpenAI.