tokenlean chains aip-proxy and LiteLLM in front of the GitHub Copilot API so that Claude Code (or any OpenAI-compatible tool) can use your Copilot subscription as a backend. It also integrates rtk as a Claude Code hook to compress command outputs, delivering two independent layers of token reduction.
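The proxy chain described above amounts to Claude Code → aip-proxy → LiteLLM → GitHub Copilot API. A minimal sketch of what the LiteLLM side of that chain could look like, assuming LiteLLM's `github_copilot` provider route; the model names here are hypothetical and the repo's actual config may differ:

```yaml
# Hypothetical LiteLLM proxy config. The model_name is what the client
# (Claude Code via aip-proxy) requests; the provider route is an assumption.
model_list:
  - model_name: claude-backend
    litellm_params:
      model: github_copilot/gpt-4.1   # assumed Copilot provider prefix
```

With a config like this, LiteLLM would expose an OpenAI-compatible endpoint that translates requests onto the Copilot subscription, while aip-proxy sits in front of it to compress prompts before they are billed as tokens.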
Stars: 2 · Forks: 0 · Watchers: 2 · Open issues: 0
Recent commits:

- 7f35f95 feat: add installation targets for jq and update Makefile for AIP Proxy
- 606d289 feat(configure_claude): add shell path injection for VS Code compatibility
- 5c50ce6 fix(pyproject): update aip-proxy dependency to version constraint
- 6250885 feat: add AIP Proxy for token compression between AI IDEs and LLM APIs
- c2e959c fix(render_aip): enhance statistics display with additional metrics and formatting
- c24cbce fix(makefile): bootstrap python3, poetry and npm on clean Linux installs
- 0701d9a feat: enhance aip-proxy startup with readiness check and logging
- d61aa2a fix: correct test numbering for OpenAI API compatibility in test_docker.sh
- 631a3c8 Merge branch 'develop' of me.github.com:futesat/tokenlean into develop
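The second layer of token reduction, the rtk hook for compressing command outputs, would be wired into Claude Code's hooks configuration. A hedged sketch of what that might look like in `~/.claude/settings.json`, using Claude Code's documented hooks schema; the bare `rtk` command and the choice of `PostToolUse` on `Bash` are assumptions, not the repo's verified setup:

```json
{
  "hooks": {
    "PostToolUse": [
      {
        "matcher": "Bash",
        "hooks": [
          { "type": "command", "command": "rtk" }
        ]
      }
    ]
  }
}
```

In this arrangement the hook fires after each shell command Claude Code runs, giving rtk a chance to shrink verbose output before it re-enters the model's context, independently of the prompt compression happening in the aip-proxy/LiteLLM chain.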