taylorwilsdon / llm-context-limits

Since OpenAI and friends refuse to give us a max_ctx param in /models, here are the current context window, input token, and output token limits for OpenAI (API), Anthropic, Qwen, DeepSeek, Llama, Phi, Gemini, and Mistral models.
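Because the provider APIs don't expose these limits, consumers typically hardcode a lookup table like the one this repo maintains. A minimal sketch of that pattern, assuming a small hand-maintained table (the model names and figures below are illustrative examples taken from public provider docs and may be outdated — always check current documentation):

```python
# Hand-maintained context-limit table. Values are illustrative examples
# from public provider docs, not guaranteed current -- verify before use.
CONTEXT_LIMITS = {
    "gpt-4o": {"context_window": 128_000, "max_output_tokens": 16_384},
    "claude-3-5-sonnet": {"context_window": 200_000, "max_output_tokens": 8_192},
}


def max_input_tokens(model: str) -> int:
    """Tokens available for the prompt after reserving the full output budget."""
    limits = CONTEXT_LIMITS[model]
    return limits["context_window"] - limits["max_output_tokens"]


print(max_input_tokens("gpt-4o"))  # 111616
```

Reserving the maximum output budget up front is conservative; in practice you can reserve only the `max_tokens` you actually plan to request.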
