A high-performance Rust-based proxy server that intelligently routes and load-balances requests between multiple LLM providers (OpenAI, Gemini, Anthropic) with built-in health monitoring and authentication management.
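The routing and health-monitoring behavior described above could be sketched as a health-aware round-robin selector over the configured providers. This is an illustrative sketch only, not the repo's actual implementation; every type and method name here (`Router`, `pick`, `mark_unhealthy`) is hypothetical.

```rust
#[derive(Debug, Clone, PartialEq)]
enum Provider {
    OpenAi,
    Gemini,
    Anthropic,
}

/// Hypothetical health-aware round-robin router over LLM providers.
struct Router {
    providers: Vec<(Provider, bool)>, // (provider, currently healthy?)
    next: usize,
}

impl Router {
    fn new(providers: Vec<Provider>) -> Self {
        Self {
            providers: providers.into_iter().map(|p| (p, true)).collect(),
            next: 0,
        }
    }

    /// Mark a provider unhealthy so the router skips it until it recovers.
    fn mark_unhealthy(&mut self, p: &Provider) {
        for (prov, healthy) in &mut self.providers {
            if *prov == *p {
                *healthy = false;
            }
        }
    }

    /// Round-robin over healthy providers; returns None if all are down.
    fn pick(&mut self) -> Option<Provider> {
        for _ in 0..self.providers.len() {
            let (p, healthy) = self.providers[self.next].clone();
            self.next = (self.next + 1) % self.providers.len();
            if healthy {
                return Some(p);
            }
        }
        None
    }
}

fn main() {
    let mut router = Router::new(vec![Provider::OpenAi, Provider::Gemini, Provider::Anthropic]);
    router.mark_unhealthy(&Provider::Gemini);
    // Gemini is skipped; requests alternate between the healthy providers.
    println!("{:?}", router.pick()); // Some(OpenAi)
    println!("{:?}", router.pick()); // Some(Anthropic)
}
```

A real implementation would layer this selection under an async HTTP server and feed the health flags from periodic probe results rather than manual calls.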
Stars: 2 | Forks: 0 | Watchers: 2 | Open Issues: 4
Recent commits:

- `3ba9efe` fix(http): preserve keep-alive alignment after chunked trailers (#56)
- `98a3f5b` feat(openai): support dynamic api_key command and add e2e coverage (#53)
- `9e9403a` fix: filter client auth headers in HTTP/1.1 when no auth_keys configured (#51)
- `69b47b4` feat: add optional TLS support for CONNECT tunnel upstream (#50)
- `514f50d` fix: avoid holding Program lock across HTTP/1.1 request read (#49)
- `e504f75` fix: release RwLock early in HTTP/1.1 proxy to prevent SIGHUP deadlock (#48)
- `f8e2e88` fix: ensure HTTP/1.1 path rewrite uses auth-selected provider (#47)
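Commits #48 and #49 both concern the same class of bug: holding a shared-state lock across a blocking HTTP/1.1 read, which deadlocks when a SIGHUP handler tries to take the write lock for a config reload. The general fix pattern can be sketched as follows; the `Config` type and `select_upstream` function are hypothetical, not the repo's actual API.

```rust
use std::sync::RwLock;

// Illustrative shared state guarded by an RwLock, as a SIGHUP-reloadable
// config might be.
struct Config {
    upstream: String,
}

/// Clone only what is needed while the read guard is held, then drop the
/// guard *before* any blocking request I/O, so a writer (e.g. a SIGHUP
/// reload handler) can acquire the lock in the meantime.
fn select_upstream(shared: &RwLock<Config>) -> String {
    let upstream = {
        let cfg = shared.read().unwrap(); // guard lives only in this block
        cfg.upstream.clone()
    }; // read guard dropped here, before blocking on the socket
    // ... blocking HTTP/1.1 request read would happen here, lock-free ...
    upstream
}

fn main() {
    let shared = RwLock::new(Config {
        upstream: String::from("openai"),
    });
    println!("{}", select_upstream(&shared));
}
```

The key design point is scoping the guard to the smallest block that covers the shared-state access, rather than letting it live across the whole request lifecycle.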