High-performance In-browser LLM Inference Engine
Stars: 17.7k · Forks: 1.2k · Watchers: 17.7k · Open Issues: 154
Top contributors by commit count: 192, 46, 28, 27, 25, 23, 13, 7, 5, 4
Recent commits:
4e8f8d4  [Build] Fix browser bundling of node: prefixed imports (#811)
f964da6  [Examples] Upgrade miscellaneous dependencies (#809)
c6f6536  [ABI] Refactor llm_chat.ts function registration to be kv state-aware (#803)
a9f37f3  Fix rollup issues and upgrade dependencies (#796)
345922b  [Feat] Support integrity verification for model artifacts (#787)
5cb898a  [Experimental Feature] Add support for cross-origin storage (#748)
ce80a1d  [Dependencies] Bump @mlc-ai/web-runtime version to 0.24.0-dev2 (#794)
5acf291  [CI] Add CI for tests, build, and security (#782)
e153315  [Chrome Extensions] Add new HF domain to content security policy (#769)
4ad316b  [XGrammar] Add structural tag support and example (#756)