LLMrv is a framework for monitoring LLM conversations against formal safety policies in real time. It models conversations as event traces, specifies policies in past-time temporal logic (ptLTL), and bridges the gap between formal Boolean semantics and free-form natural language through a semantic grounding layer.
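To make the event-trace idea concrete, here is a minimal, hypothetical sketch (not LLMrv's actual API; class and proposition names are invented) of an incremental monitor for a simple ptLTL policy of the form "once(trigger) implies not forbidden", evaluated at every conversation event:

```python
class PtLTLMonitor:
    """Incrementally evaluates 'once(trigger) -> not forbidden' over a trace.

    Each event is a valuation mapping proposition names to booleans,
    e.g. produced by a semantic grounding layer from free-form text.
    """

    def __init__(self, trigger: str, forbidden: str):
        self.trigger = trigger
        self.forbidden = forbidden
        self.triggered = False  # summary state for the past-time 'once' operator

    def step(self, event: dict) -> bool:
        """Consume one event; return the policy verdict at this point."""
        # once(trigger) holds if trigger held now or at any earlier event
        self.triggered = self.triggered or event.get(self.trigger, False)
        # implication: once(trigger) -> not forbidden
        return (not self.triggered) or (not event.get(self.forbidden, False))


# Hypothetical trace: propositions 'pii_shared' and 'tool_call' are examples.
monitor = PtLTLMonitor(trigger="pii_shared", forbidden="tool_call")
trace = [
    {"pii_shared": False, "tool_call": True},   # ok: no PII seen yet
    {"pii_shared": True,  "tool_call": False},  # PII appears in the conversation
    {"pii_shared": False, "tool_call": True},   # violation: tool call after PII
]
verdicts = [monitor.step(e) for e in trace]
print(verdicts)  # [True, True, False]
```

The key property of past-time operators is that they can be evaluated with constant summary state per operator, so the monitor runs in real time without re-scanning the trace.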
Stars: 2 · Forks: 0 · Watchers: 2 · Open Issues: 0
Repository health: no `package.json` found, so this is likely not a Node.js project.
Recent commits:
- feat: Add built-in proposition 'user_turn' and update related tests (22a34fc)
- feat: Enhance grounding prompt functionality and few-shot example generation (2451ad9)
- Refine user data access claims in grounding dataset for clarity (565baf7)
- Update semantic_anchor_generator.py anchor generation mechanism (264bbff)
- Update semantic_anchor_generator.py to work with simple KNN (ded7c0c)