enescingoz / colab-llm
This repository provides a ready-to-use Google Colab notebook that turns Colab into a temporary server for running local LLM models using Ollama. It exposes the model API via a secure Cloudflare tunnel, allowing remote access from tools like curl or ROO Code in VS Code — no server setup or cloud deployment required.
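As a sketch of how a remote client might talk to the tunneled model once the notebook is running: Ollama exposes a standard `/api/generate` endpoint that accepts a JSON body with `model` and `prompt` fields. The tunnel URL and model name below are placeholders, not values from this repository.

```python
import json
import urllib.request

# Placeholder values: substitute the URL printed by the Cloudflare tunnel
# in the notebook output, and whichever model was pulled with Ollama.
TUNNEL_URL = "https://example-tunnel.trycloudflare.com"
MODEL = "llama3"

def build_generate_request(prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {"model": MODEL, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{TUNNEL_URL}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("Why is the sky blue?")
# resp = urllib.request.urlopen(req)          # sends the request over the tunnel
# print(json.loads(resp.read())["response"])  # the model's completion text
```

The same call works from `curl` by POSTing the identical JSON to `<tunnel-url>/api/generate`, which is how tools like Roo Code in VS Code reach the model remotely.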
118 stars · Apr 14, 2025 · Updated 10 months ago
