Local reverse proxy for LLM API calls. Token compression, PII scrubbing, budget enforcement, and full observability in a single Go binary. Works with OpenClaw, Claude Code, Cursor, and any OpenAI/Anthropic-compatible client.
Stars: 1 · Forks: 0 · Watchers: 1 · Open Issues: 0
- v1.8.0 (aa818ac): Verbose debug/trace logging, streaming token counts, pricing updates
- v1.7.0 (e0f9267): Respect client cache_control, enforce Anthropic API limits
- v1.6.0 (f3365c6): Fix cache_control TTL ordering for Anthropic API compatibility
- v1.5.0 (ecdf74e): Cache correctness, rate limit fixes, OpenAI fidelity, body storage control
- 36b4820: Fix 8 bugs: double close, silenced errors, data race, OpenAI field loss, stream DoS, PII exposure, router fallback, store safety
- 0018bac: Fix 5 confirmed bugs: context leak, docs mismatch, rate limiting, PII hash, stream DoS
- fd3ab6f: Fix Anthropic API compatibility: forward headers and preserve request body fields
- 53d0dab: Add dist folder with compiled binary and installation guide
- dade1bd: Fix 7 confirmed bugs and correct module path to allaspectsdev/tokenman
- a57db26: OpenTelemetry distributed tracing with config-gated activation
- 45d8bda: Medium issues: thread-safe providers, error channel drain, DB error logging
- 7547200: Phase 4: Testing & Deployment — full test coverage, hot-reload, Dockerfile
- ac75c61: Phase 3: Observability — Prometheus metrics, readiness probe, middleware timing
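Since the project targets OpenAI/Anthropic-compatible clients, hooking one up typically means redirecting its base URL at the proxy. A hedged sketch, assuming the proxy listens on `localhost:8080` (a hypothetical port; check the project's actual configuration) and using the base-URL environment variables the official OpenAI and Anthropic SDKs recognize:

```shell
# Point SDK-based and CLI clients at the local proxy instead of the vendor.
# Port 8080 is an assumption, not tokenman's documented default.
export OPENAI_BASE_URL="http://localhost:8080/v1"
export ANTHROPIC_BASE_URL="http://localhost:8080"
```

With these set, tools that honor the standard SDK environment variables route their traffic through the proxy without any code changes.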