A project that uses computer vision and machine learning to interpret American Sign Language (ASL) hand gestures in real time. It relies on the MediaPipe library for hand-landmark detection and on TensorFlow/Keras for gesture classification, improving accessibility and communication for the deaf and hard-of-hearing community.
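The pipeline described above typically feeds MediaPipe's 21 hand landmarks into a small Keras classifier. As a minimal sketch of the preprocessing step, the helper below (a hypothetical `landmarks_to_features`, not necessarily the repository's actual code) normalizes the landmarks for translation and hand size before classification; the exact feature scheme in the project may differ.

```python
import numpy as np

def landmarks_to_features(landmarks):
    """Convert 21 MediaPipe hand landmarks (x, y, z) into a
    translation- and scale-invariant feature vector.

    `landmarks`: array-like of shape (21, 3); landmark 0 is the wrist.
    This is an illustrative preprocessing scheme, not necessarily
    the one used in the repository.
    """
    pts = np.asarray(landmarks, dtype=np.float32)
    pts = pts - pts[0]                       # translate: wrist becomes the origin
    scale = np.max(np.linalg.norm(pts, axis=1))
    if scale > 0:
        pts = pts / scale                    # normalize for hand size / camera distance
    return pts.flatten()                     # 63-dimensional feature vector

# A classifier over these features could then be a small Keras model
# (sketch only; layer sizes and num_classes are assumptions):
#
# model = tf.keras.Sequential([
#     tf.keras.layers.Input(shape=(63,)),
#     tf.keras.layers.Dense(64, activation="relu"),
#     tf.keras.layers.Dense(num_classes, activation="softmax"),
# ])
```

Normalizing relative to the wrist makes the features robust to where the hand appears in the frame, which matters for real-time webcam input.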
Stars: 2
Forks: 0
Watchers: 2
Open Issues: 0
Overall repository health assessment
No package.json found; this might not be a Node.js project.
Commits: 3