developersdigest / ollama-anywhere

Access the Ollama inference server running on your computer from anywhere. Built with Next.js, LangChain JS (LCEL), and ngrok.
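The core idea is to tunnel the local Ollama HTTP API (which listens on `localhost:11434` by default) through ngrok so it is reachable from a public URL. A minimal sketch of that setup, assuming a local Ollama install and an ngrok account; the public URL shown is a placeholder, and the model name `llama3` is only an example:

```shell
# Start the Ollama server locally (listens on localhost:11434 by default).
ollama serve

# In another terminal, expose port 11434 through ngrok.
# Rewriting the Host header is commonly needed because Ollama
# rejects requests whose Host header it does not recognize.
ngrok http 11434 --host-header="localhost:11434"

# From anywhere, call the Ollama API through the public ngrok URL
# (replace the placeholder with the forwarding URL ngrok prints).
curl https://<your-ngrok-subdomain>.ngrok-free.app/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

A Next.js frontend (as in this repo) would then point its LangChain JS client at the ngrok URL instead of `localhost:11434`.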
