Since OpenAI and friends refuse to give us a max_ctx param in /models, here are the current context-window, input-token, and output-token limits for OpenAI (API), Anthropic, Qwen, DeepSeek, Llama, Phi, Gemini, and Mistral.
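Because no provider exposes these limits through its models-listing endpoint, the practical workaround is a hard-coded lookup table like the one this repo maintains. A minimal sketch, assuming a simple dict keyed by model id (the example entries and values are illustrative and may be outdated):

```python
# Context/output limits are not returned by the API, so they must be
# hard-coded. Entries below are illustrative examples only and may be
# outdated; the repo's table is the maintained source.
MODEL_LIMITS = {
    # model id: (context window, max output tokens)
    "gpt-4o": (128_000, 16_384),
    "claude-3-5-sonnet-20241022": (200_000, 8_192),
}

def get_limits(model: str) -> tuple[int, int]:
    """Look up (context_window, max_output_tokens) for a model id."""
    try:
        return MODEL_LIMITS[model]
    except KeyError:
        raise ValueError(f"unknown model: {model!r}") from None

ctx, out = get_limits("gpt-4o")
```

A table like this needs manual updates whenever a provider ships a new model or raises a limit, which is exactly the maintenance burden the missing API field creates.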
Stars: 67 · Forks: 8 · Watchers: 67 · Open Issues: 1
Commits: 36