Description
The Hand Gesture Recognizer uses MediaPipe and OpenCV to detect and interpret hand gestures in real time, enabling touchless control of applications such as presentations. It incorporates American Sign Language (ASL) gestures and applies a CNN with transfer learning for accurate recognition, offering customizable gesture controls for various use cases.
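As a rough illustration of the real-time detection pipeline, the sketch below uses MediaPipe's Hands solution with an OpenCV webcam loop to find and draw hand landmarks per frame. It is a minimal sketch of the general technique, not the project's actual implementation; window names and thresholds are illustrative assumptions.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam

with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        results = hands.process(rgb)
        if results.multi_hand_landmarks:
            for hand_landmarks in results.multi_hand_landmarks:
                # Draw the 21 hand landmarks and their connections on the frame
                mp_draw.draw_landmarks(frame, hand_landmarks,
                                       mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Hand Gesture Recognizer", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

cap.release()
cv2.destroyAllWindows()
```

The landmark coordinates produced per frame are what a downstream classifier (e.g. a CNN trained with transfer learning on an ASL dataset, as in this project) would consume to label the gesture.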
Practical Use Case and User Story
As a presenter, I need the Hand Gesture Recognizer to control presentations through camera-based gesture detection. It should allow me to navigate slides, draw, erase, and move a pointer using hand movements, without the need for physical devices. The system’s real-time performance and customizable gesture mapping will enhance interaction in classrooms, business meetings, or smart home settings, offering a touchless and intuitive user experience.
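To make the slide-control story concrete, here is a hedged sketch using CVZone's HandDetector (which wraps MediaPipe) to map finger-up patterns to presentation actions. The gesture-to-action table is a hypothetical mapping chosen for illustration, and the sketch only prints the action; a keystroke sender such as pyautogui would be needed to actually drive presentation software.

```python
import cv2
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)
detector = HandDetector(detectionCon=0.8, maxHands=1)

# Hypothetical gesture map: finger-state pattern (thumb..pinky) -> action
GESTURE_ACTIONS = {
    (1, 0, 0, 0, 0): "previous slide",   # thumb only
    (0, 0, 0, 0, 1): "next slide",       # pinky only
    (0, 1, 0, 0, 0): "pointer",          # index finger
    (0, 1, 1, 0, 0): "draw",             # index + middle
    (0, 1, 1, 1, 0): "erase",            # index + middle + ring
}

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hands, frame = detector.findHands(frame)  # detect hand and draw landmarks
    if hands:
        fingers = tuple(detector.fingersUp(hands[0]))  # 1 = finger up, 0 = down
        action = GESTURE_ACTIONS.get(fingers)
        if action:
            cv2.putText(frame, action, (20, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
            # A keystroke sender (e.g. pyautogui.press("right")) would go here
            # to advance slides; printing keeps the sketch dependency-free.
            print(action)
    cv2.imshow("Presentation Control", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```

In practice the gesture map would be user-configurable, and a short debounce (e.g. requiring the same pattern for several consecutive frames) helps avoid firing an action more than once per gesture.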
Tech Stack Involved
MediaPipe (Google), OpenCV, Python, CNN and Transfer Learning, CVZone, American Sign Language (ASL) Dataset