This project unifies the management of LLM APIs: it exposes a single API interface that can call multiple backend services, converts their responses to the OpenAI format, and supports load balancing across them. Currently supported backends include OpenAI, Anthropic, DeepBricks, OpenRouter, Gemini, and Vertex.
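The core idea is response normalization: whatever backend handles the request, the caller sees an OpenAI-shaped reply. A minimal sketch of that kind of conversion, mapping an Anthropic Messages API response into the OpenAI chat completion shape. The field names follow the public Anthropic and OpenAI response formats, but the function itself is illustrative, not this project's actual code:

```python
def anthropic_to_openai(resp: dict) -> dict:
    """Map an Anthropic Messages API response to OpenAI chat.completion form.

    Illustrative sketch only; a real gateway also handles tool calls,
    streaming chunks, and error payloads.
    """
    # Concatenate all text blocks from the Anthropic content array.
    text = "".join(
        block["text"]
        for block in resp.get("content", [])
        if block.get("type") == "text"
    )
    # Translate Anthropic stop reasons to OpenAI finish reasons.
    stop_map = {"end_turn": "stop", "max_tokens": "length"}
    return {
        "id": resp.get("id", ""),
        "object": "chat.completion",
        "model": resp.get("model", ""),
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": stop_map.get(resp.get("stop_reason"), "stop"),
        }],
        "usage": {
            "prompt_tokens": resp.get("usage", {}).get("input_tokens", 0),
            "completion_tokens": resp.get("usage", {}).get("output_tokens", 0),
        },
    }
```

With this in place, a load balancer can route the same request to any supported backend and hand back a uniform response object.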
Stars: 1.2k
Forks: 151
Watchers: 1.2k
Open Issues: 11
Commits by contributor: 838, 574, 6, 3, 2, 2, 2, 1, 1, 1
Recent commits:
- 273fa83 feat: add rate limit retry-after detection and dynamic cooling time
- 5dc4587 test: add tests for deep merge without mutation and gemini model aliasing
- b535aaf feat: add error status code mapping and HTTP exception handling for stream failures
- 9cce7cc feat: add stream priming for Responses API to buffer preflight events
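Commit 273fa83 mentions Retry-After detection with dynamic cooling time. A minimal sketch of that general technique, assuming a dict of response headers from a 429 reply; the helper name and default value are illustrative, not taken from this project:

```python
import email.utils
import time


def cooldown_seconds(headers: dict, default: float = 1.0) -> float:
    """Derive a cooling time from a 429 response's Retry-After header.

    Per RFC 9110, Retry-After is either an integer number of seconds or
    an HTTP-date; fall back to `default` when absent or unparsable.
    """
    value = headers.get("retry-after") or headers.get("Retry-After")
    if value is None:
        return default
    # Integer-seconds form, e.g. "Retry-After: 7".
    try:
        return max(float(value), 0.0)
    except ValueError:
        pass
    # HTTP-date form, e.g. "Retry-After: Wed, 21 Oct 2025 07:28:00 GMT".
    try:
        parsed = email.utils.parsedate_to_datetime(value)
        return max(parsed.timestamp() - time.time(), 0.0)
    except (TypeError, ValueError):
        return default
```

A gateway can feed this value into its per-backend cooldown so that a rate-limited upstream is skipped by the load balancer until the server-advertised window has passed.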