Multi-Channel Message Relay with AI Customer Service 🚀
A FastAPI-powered messaging system for Email ✉️, SMS 📲, and Voice 🎙️, featuring Redis scheduling ⏰, implemented with a custom Decorator Design Pattern 🎨 and an IoC Container. It also includes an AI assistant leveraging an LLM, RAG 📚, a Knowledge Graph (KG) 🧠, and Speech-to-Text 🎤 throughout the project.
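The Decorator Design Pattern mentioned above can be sketched as follows. This is a hypothetical illustration, assuming a common message-sender shape; `MessageSender`, `EmailSender`, and `RetryDecorator` are illustrative names, not identifiers from the repository.

```python
from abc import ABC, abstractmethod


class MessageSender(ABC):
    """Common interface for all channels (Email, SMS, Voice)."""

    @abstractmethod
    def send(self, recipient: str, body: str) -> str: ...


class EmailSender(MessageSender):
    def send(self, recipient: str, body: str) -> str:
        return f"EMAIL to {recipient}: {body}"


class SenderDecorator(MessageSender):
    """Base decorator: wraps another sender and delegates to it."""

    def __init__(self, wrapped: MessageSender):
        self._wrapped = wrapped

    def send(self, recipient: str, body: str) -> str:
        return self._wrapped.send(recipient, body)


class RetryDecorator(SenderDecorator):
    """Adds retry behavior without touching the wrapped sender's code."""

    def __init__(self, wrapped: MessageSender, attempts: int = 3):
        super().__init__(wrapped)
        self.attempts = attempts

    def send(self, recipient: str, body: str) -> str:
        last_error: Exception | None = None
        for _ in range(self.attempts):
            try:
                return super().send(recipient, body)
            except Exception as exc:
                last_error = exc
        raise last_error


sender = RetryDecorator(EmailSender())
print(sender.send("user@example.com", "hello"))
```

Because every decorator is itself a `MessageSender`, cross-cutting behavior (retries, logging, rate limiting) can be stacked on any channel without modifying the channel classes.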
Stars: 13 · Forks: 1 · Watchers: 13 · Open Issues: 0
Recent commits (1.8k total):
- `5d03cc6` Merge pull request #31 from princee1/features/agentic
- `3e840ab` Merge pull request #30 from princee1/features/agentc-temp
- `1b39e​e9` BUG: trying to update the Mongo database to replica mode but keep getting an error
- `09f8c10` FEAT: reduced the number of network round trips when doing a credit purchase/refund by using a Redis transaction; this caused a refactor but should enhance performance
- `1fe7a69` FEAT: added a way to register multiple dep services, each simply adding itself to the `used_by_service` of the needed mini service, so everything related to the status still works as intended
- `4e1385d` Merge pull request #29 from princee1/features/agentic
- `79e52c9` FEAT: added all models from supported providers to verify model existence upon usage
- `01fd0ef` FEAT: added a parameter to the llm_model and agent_model to instantiate the chat model
- `aa63ba0` FEAT: added a safe payment. The cost is an object that abstracts the cost of a request, and the cost must be paid to the merchant, which is the server. The server offers a safe payment mode: if it can't resolve the service, it assures credit consistency in certain instances. This forces the money to be paid before the service is performed.
- `053e547` FEAT: added a payment method on the broker for error-proof handling; after the request is processed, if there is any error the payment is refunded
- `3de947f` FEAT: added the complete architecture for sending prompts through the agentic server. Each agent mini service, a mirror of the remote agent service, has the LLM provider as a dep service. Why a mirror? Because when the agent (which abstracts the prompt, the context-retrieval method, and its parameters) changes, the change must be reflected on the agentic server as well, since the request comes from the app server.
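The pay-before-service flow with refund-on-error described in the payment commits above can be sketched as follows. This is a minimal illustration only: an in-memory dict stands in for Redis (the project uses a Redis transaction to make the debit atomic), and `balances` and `charge_and_run` are hypothetical names, not taken from the codebase.

```python
# In-memory stand-in for the Redis credit store; the real project would
# wrap the debit/refund in a Redis MULTI/EXEC transaction for atomicity.
balances = {"user-1": 100}


def charge_and_run(user: str, cost: int, handler):
    """Debit the cost up front, run the service handler, refund on any error."""
    if balances.get(user, 0) < cost:
        raise ValueError("insufficient credits")
    balances[user] -= cost          # payment is taken before the service runs
    try:
        return handler()
    except Exception:
        balances[user] += cost      # error-proof path: refund the debit
        raise
```

Taking payment first guarantees the merchant (the server) is never owed money mid-request, while the refund in the `except` branch keeps the user's balance consistent if the service fails.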