A local AI development setup using Ollama and LLaMA 2 on Mac, allowing you to run LLaMA 2 fully offline for coding assistance, natural language queries, and experimentation without relying on cloud APIs.
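Once Ollama is installed and a model is pulled (e.g. `ollama pull llama2`), the local server can be queried from code as well as from `ollama run`. Below is a minimal sketch, assuming Ollama's default port 11434 and a locally available `llama2` model; the helper names are illustrative, not part of this repository.

```python
import json
import urllib.request

# Assumption: Ollama's default local endpoint after a standard install.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt, model="llama2"):
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False requests one complete JSON response instead of
    newline-delimited streaming chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def query_ollama(prompt, model="llama2"):
    """Send a prompt to the local Ollama server and return the reply text."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(query_ollama("Write a one-line Python hello world."))
```

Because everything runs against `localhost`, no cloud API key or network access is needed once the model weights are downloaded.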
Stars: 0 · Forks: 0 · Watchers: 0 · Open Issues: 0
Overall repository health assessment: no package.json found; this might not be a Node.js project.
Commits: 3