The original local LLM interface. Text, vision, tool-calling, training, and more. 100% offline.
Stars: 46.4k
Forks: 5.9k
Watchers: 46.4k
Open Issues: 770
Overall repository health assessment: no package.json was found, so this is likely not a Node.js project.
Commits by top contributors: 4.5k, 109, 87, 39, 28, 24, 20, 19, 16, 13
Recent commits:
95d6c53  Revert "API: Add warning about vanilla llama-server not supporting prompt logprobs + instructions"
66d1a22  Fix crash when no model is selected (None passed to resolve_model_path)
000d776  Revert "llama.cpp: Disable jinja by default (we use Python jinja, not cpp jinja)"
a1cb5b5  llama.cpp: Disable jinja by default (we use Python jinja, not cpp jinja)
42dfcdf  API: Add warning about vanilla llama-server not supporting prompt logprobs + instructions
b108c55  Fix portable builds not starting due to missing ik element
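One of the commits above fixes a crash caused by passing None to resolve_model_path when no model is selected. The actual patch is not shown here; the following is only a minimal sketch of that defensive pattern, where the function name comes from the commit message but the body, the models directory, and the error message are all assumptions:

```python
from pathlib import Path
from typing import Optional

# Assumed models directory, for illustration only.
MODELS_DIR = Path("user_data/models")

def resolve_model_path(model_name: Optional[str]) -> Path:
    """Resolve a model name to a filesystem path.

    Raises a clear ValueError when no model is selected (model_name is
    None) instead of crashing with a TypeError deeper in path handling.
    """
    if model_name is None:
        raise ValueError("No model is selected; load a model first.")
    return MODELS_DIR / model_name

# Usage: a None argument now yields an actionable error message.
try:
    resolve_model_path(None)
except ValueError as err:
    print(err)
```

The point of the guard is to fail early with a user-facing message rather than let None propagate into `Path` operations.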