enescingoz / colab-llm

This repository provides a ready-to-use Google Colab notebook that turns Colab into a temporary server for running local LLMs with Ollama. It exposes the model API through a secure Cloudflare tunnel, allowing remote access from tools such as curl or ROO Code in VS Code, with no server setup or cloud deployment required.
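
As a rough sketch of how a remote client could talk to the tunneled Ollama API: the snippet below posts a prompt to Ollama's standard `/api/generate` endpoint. The tunnel URL and model name are placeholders, not values from this repository; substitute the URL printed by the notebook and a model you have pulled in the Colab session.

```python
import requests

# Placeholder values -- replace with the tunnel URL printed by the notebook
# and a model pulled inside the Colab session (both are assumptions here).
TUNNEL_URL = "https://example-tunnel.trycloudflare.com"
MODEL = "llama3"

# Ollama's generate endpoint; stream=False returns a single JSON object
# whose "response" field holds the full completion.
resp = requests.post(
    f"{TUNNEL_URL}/api/generate",
    json={"model": MODEL, "prompt": "Why is the sky blue?", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

The same request works from curl or from any tool that can reach the tunnel URL, since the tunnel simply forwards traffic to the Ollama server running inside the Colab runtime.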

Alternatives and similar repositories for colab-llm

Users interested in colab-llm are comparing it to the libraries listed below.