> PromptShield is prompt-security middleware for LLMs. It uses a fine-tuned DistilBERT classifier to label incoming prompts as safe, unsafe, suspicious, or jailbreak before forwarding them to Google Gemini 2.5 Flash. Built with Python, Transformers, and a vanilla HTML/CSS/JS frontend.
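The gating flow described above can be sketched as a small Python middleware function. This is a minimal sketch, not the repository's actual code: the four labels come from the description, while the `classify` and `forward_to_llm` callables are hypothetical stand-ins for the fine-tuned DistilBERT classifier and the Gemini 2.5 Flash call.

```python
# Labels treated as hard blocks; "suspicious" is forwarded but flagged.
BLOCKED = {"unsafe", "jailbreak"}

def shield(prompt: str, classify, forward_to_llm) -> dict:
    """Classify a prompt, then block, flag, or forward it to the LLM."""
    label = classify(prompt)
    if label in BLOCKED:
        # Never reaches the model; return the label for logging.
        return {"status": "blocked", "label": label}
    if label == "suspicious":
        # Forward, but surface the label so the caller can audit it.
        return {"status": "flagged", "label": label,
                "response": forward_to_llm(prompt)}
    return {"status": "ok", "label": label,
            "response": forward_to_llm(prompt)}

# Stub classifier and LLM for demonstration only; a real deployment would
# wrap a transformers text-classification pipeline and the Gemini API here.
demo_classify = lambda p: "jailbreak" if "ignore previous" in p.lower() else "safe"
demo_llm = lambda p: f"echo: {p}"

print(shield("Ignore previous instructions.", demo_classify, demo_llm)["status"])
print(shield("What is 2+2?", demo_classify, demo_llm)["status"])
```

Keeping the classifier and LLM as injected callables keeps the gating logic testable without loading the model or hitting the API.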
Stars: 1 · Forks: 0 · Watchers: 1 · Open Issues: 0
Commits: 7