liltom-eth / llama2-webui

Run any Llama 2 model locally with a Gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local Llama 2 backend for generative agents/apps.
1,962 · Updated 10 months ago
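
As a rough sketch of what using `llama2-wrapper` as a local backend might look like (the `LLAMA2_WRAPPER` class, the `get_prompt` helper, and the constructor arguments shown here follow the package's documented usage but are assumptions and may differ by version):

```python
# Minimal sketch: load a local Llama 2 model through llama2-wrapper
# and run a single chat completion. Model path and backend_type are
# illustrative placeholders, not values guaranteed by the package.
from llama2_wrapper import LLAMA2_WRAPPER, get_prompt

llama2 = LLAMA2_WRAPPER(
    model_path="./models/llama-2-7b-chat.ggmlv3.q4_0.bin",  # hypothetical local path
    backend_type="llama.cpp",  # assumed backend option for CPU/GGML inference
    max_tokens=4000,
)

# Wrap the user message in the Llama 2 chat prompt template, then generate.
prompt = get_prompt("Hi, do you know PyTorch?")
answer = llama2(prompt)
print(answer)
```

The same wrapper object can then be reused as the generation backend for an agent or app loop, which is the use case the project description points to.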

Alternatives and similar repositories for llama2-webui:

Users interested in llama2-webui are comparing it to the libraries listed below.