fau-masters-collected-works-cgarbin / gpt-all-local

A "chat with your data" example: using a large language models (LLM) to interact with our own (local) data. Everything is local: the embedding model, the LLM, the vector database. This is an example of retrieval-augmented generation (RAG): we find relevant sections from our documents and pass it to the LLM as part of the prompt (see pics).

Related projects

Alternatives and complementary repositories for gpt-all-local