Your personal engine for running open source models locally. Kronk uses Go for hardware-accelerated local inference, with llama.cpp integrated directly into your Go applications via the yzma module. Kronk provides a high-level API that feels similar to using an OpenAI-compatible API, and also provides a model server to run local work…
Stars: 259
Forks: 22
Watchers: 259
Open Issues: 2
Commit counts by top contributors: 420, 27, 7, 4, 3.