OpenLLM Monitor is a plug-and-play, real-time observability dashboard for monitoring and debugging LLM API calls across OpenAI, Ollama, OpenRouter, and more. It tracks tokens, latency, cost, and retries, and lets you replay prompts. Fully open-source and self-hostable.
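To make the per-call metrics concrete, here is a minimal sketch of the kind of record such a dashboard collects: wrap a provider call, time it, and attach token and cost figures. All names (`monitorCall`, `fakeProvider`) and the flat cost rate are hypothetical illustrations, not OpenLLM Monitor's actual API.

```javascript
// Hypothetical sketch: wrap any provider call and capture the metrics
// an observability dashboard would log (tokens, latency, cost).
async function monitorCall(provider, prompt, callFn) {
  const start = Date.now();
  const result = await callFn(prompt);       // the actual LLM request
  const latencyMs = Date.now() - start;      // wall-clock latency
  // Assumed flat per-1k-token rate, purely for illustration.
  const costUsd = (result.tokens / 1000) * 0.002;
  return {
    provider,
    prompt,
    tokens: result.tokens,
    latencyMs,
    costUsd,
    response: result.text,
  };
}

// Stand-in for a real OpenAI/Ollama call so the sketch runs offline.
async function fakeProvider(prompt) {
  return { text: `echo: ${prompt}`, tokens: prompt.split(/\s+/).length };
}

monitorCall("openai", "hello observability world", fakeProvider)
  .then((record) => console.log(record));
```

A real integration would instead proxy the provider SDK, read token counts from the provider's usage response, and persist each record so the dashboard can chart latency and cost over time.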
Stars: 35 | Forks: 6 | Watchers: 35 | Open Issues: 1
Overall repository health assessment: no package.json found; this might not be a Node.js project.
43 commits; most recent:
- 6172e25: feat(replay): add Gemini and Grok provider support to replay controller
- c33fe2d: feat(replay-zone): improve similarity metric and model selection UX
- 4bc7cf9: feat(tokens): ensure accurate async token counting for all models
- 89a7bc2: perf(replay): optimize response times and eliminate hanging requests
- f37e1dd: feat: add empty state overlays for dashboard charts with