taylorwilsdon / llm-context-limits

Since OpenAI and friends refuse to give us a max_ctx param in /models, here are the current context window, input token, and output token limits for OpenAI (API), Anthropic, Qwen, DeepSeek, Llama, Phi, Gemini, and Mistral.
65 stars · Dec 20, 2025 · Updated last month
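Because the /models endpoint returns no context-length field, the limits have to be maintained as a static lookup. A minimal sketch of that idea in Python, using two illustrative entries (the model IDs and token values below are examples and may lag behind provider documentation; the repo's tables are the source of truth):

```python
# Static lookup table: model_id -> (context_window_tokens, max_output_tokens).
# Needed because the provider APIs do not expose these limits programmatically.
# Values are illustrative and may be outdated.
LIMITS = {
    "gpt-4o": (128_000, 16_384),
    "claude-3-5-sonnet": (200_000, 8_192),
}

def context_window(model_id: str) -> int:
    """Return the context window for a known model; raises KeyError if unknown."""
    ctx, _max_out = LIMITS[model_id]
    return ctx

def max_output(model_id: str) -> int:
    """Return the max output tokens for a known model; raises KeyError if unknown."""
    _ctx, max_out = LIMITS[model_id]
    return max_out
```

A caller can then clamp request parameters, e.g. `min(requested_tokens, max_output("gpt-4o"))`, instead of discovering the limit via an API error.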
