Enterprise RAG ecosystem managing 15,000+ semantic chunks. Features hybrid parsing (LlamaParse/PyMuPDF) and 256-dimensional MRL (Matryoshka Representation Learning) embeddings for 512 MB RAM environments.
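The description's point about 256-dim MRL embeddings is that Matryoshka-trained vectors can be truncated to their leading dimensions and re-normalized, trading a little retrieval quality for a large memory saving. A minimal sketch of that step (the function name and dimensions are illustrative, not from the repo):

```python
import numpy as np

def truncate_mrl_embedding(embedding: np.ndarray, dim: int = 256) -> np.ndarray:
    """Keep the first `dim` components of a Matryoshka-trained embedding
    and re-normalize so cosine similarity remains meaningful."""
    truncated = embedding[:dim]
    norm = np.linalg.norm(truncated)
    return truncated / norm if norm > 0 else truncated

# Example: a hypothetical 768-dim vector shrinks to 256 dims,
# roughly a 3x memory saving per stored chunk.
full = np.random.default_rng(0).normal(size=768)
small = truncate_mrl_embedding(full)
assert small.shape == (256,)
```

At 4 bytes per float32 component, 15,000 chunks at 256 dims need about 15 MB of vector storage, which is what makes a 512 MB RAM target plausible.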
Stars: 47
Forks: 21
Watchers: 47
Open Issues: 1
Overall repository health assessment:
No package.json found; this might not be a Node.js project.
No contributors data available.
Recent commits:
9b18400  revert: model experiment (back to stable gemini-3.1-flash-lite-preview)
c1914a1  feat: implement dual-model fallback (gemma-4-31b-it + gemini-3.1-flash-lite-preview)
a7772df  UI Fix: Make Dashboard header sticky for better mobile accessibility
0d6c167  UI Enhancement: Moved System Status to header and linked to UptimeRobot status page with modern responsive styling
050a90f  Finalize 8-node Agentic RAG: Stable Gemini model, Integrated graceful degradation logic, and Strict system prompts
de76d4c  Add graceful_degradation_response function definition
24a6ce7  Update system prompt: Added Hallucination Guard instructions
a66db68  Update system prompt: Added CHANGES vs CONTINUITY instruction
32df05d  Update system prompt: Stricter MODE B (No Context) instructions
7c3a338  Switch to gemini-3.1-flash-lite-preview model - final cleanup
2119749  CLEANUP: Removed imports and regex, set model to gemma-2-27b-it
8693e5d  UNDOING: Reverting to initial Gemini integration after failed experiments
f66987b  Fix UnboundLocalError: initialize answer variable in call_llm