Back to search
🛡️ Open-source runtime security for LLM applications. Detect and block prompt injection, PII leaks, and sensitive data exposure before responses are generated.
Stars
2
Forks
0
Watchers
2
Open Issues
0
Overall repository health assessment
No package.json found
This might not be a Node.js project
No contributors data available
v0.1.1: add Guard modes and sanitize prompt injection output
ed68c2eView on GitHub