Stars: 0 | Forks: 0 | Watchers: 0 | Open Issues: 0
Overall repository health assessment: no package.json found; this might not be a Node.js project.
23 commits (13 shown):

- f627209 feat: send video as multiple frames (1 FPS, max 64) to llama-server
- 65d1e7a fix: extract video frame as JPEG before sending to llama-server
- 6bfb049 fix: pass --mmproj to llama-server for vision/video support
- c274c5c fix: add video upload support and file loading feedback to simple UI
- 707b28b feat: add simple chat mode (llama-server only, no Docker/backend/frontend)
- 74a2512 fix: use taskkill for process management on Windows
- da9436e fix: use rglob to recursively check for files in support model dirs
- b0c2d72 fix: check support model dirs are non-empty before skipping download
- 78b2121 fix: add direct llama-server path to binary search candidates
- 685bb3d fix: set PYTHONIOENCODING=utf-8 for service subprocesses on Windows
- d7ba50f fix: handle WinError 87 on print(flush=True) in Windows console
- f3c3e56 fix: check for support model dirs before skipping model download
- 6842af0 fix: UTF-8 encoding for pyproject.toml, robust pnpm resolution