This project is a Hand Sign Recognition System that uses MediaPipe's hand tracking to detect and recognize hand gestures from a webcam feed.
- Train the system with custom hand signs.
- Store multiple samples per sign for better recognition accuracy.
- Predict hand signs in real-time based on trained data.
- Enter a Hand Sign Name in the input field.
- Click Train Now to set the hand sign name.
- Click Start Training to begin capturing hand sign samples.
- Click Stop Training when enough samples are captured.
- Repeat for multiple hand signs.
- Click Predict to switch to prediction mode.
- Show a trained hand sign to the camera.
- The system compares the detected landmarks with stored samples.
- The closest match is displayed on the screen.
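The training steps above amount to collecting landmark frames under each sign name. A minimal sketch of that data structure (the names `trainedSigns` and `addSample` are illustrative, not taken from the project source):

```javascript
// Each sign name maps to an array of samples; each sample is a flat list
// of landmark coordinates as produced by MediaPipe Hands
// (21 landmarks x {x, y, z} per frame).
const trainedSigns = {};

function addSample(signName, landmarks) {
  // Flatten [{x, y, z}, ...] into [x0, y0, z0, x1, ...] so samples
  // can later be compared element-by-element.
  const flat = landmarks.flatMap(p => [p.x, p.y, p.z]);
  if (!trainedSigns[signName]) trainedSigns[signName] = [];
  trainedSigns[signName].push(flat);
}
```

Clicking Start Training would then call `addSample` on every detection result until Stop Training is clicked.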
- Input field for Hand Sign Name.
- Buttons for Train Now, Start Training, Stop Training, and Predict.
- A canvas for displaying hand landmarks.
- A hidden video element for webcam input.
- A section to display trained hand signs.
- Training Mode: Captures hand landmarks and stores them under the given sign name.
- Prediction Mode: Compares detected hand landmarks with stored samples using Euclidean Distance.
- MediaPipe Integration: Uses MediaPipe's Hands API to detect hand keypoints and draw them on a canvas.
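The prediction-mode comparison can be sketched as a nearest-sample search over the stored landmark vectors. This is a minimal illustration of the Euclidean-distance approach described above, with hypothetical names (`euclideanDistance`, `predictSign`), not the project's actual functions:

```javascript
// Straight-line distance between two equal-length landmark vectors.
function euclideanDistance(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) {
    const d = a[i] - b[i];
    sum += d * d;
  }
  return Math.sqrt(sum);
}

// Return the sign whose stored sample lies closest to the live landmarks.
// trainedSigns: { signName: [flatSample, ...] }, liveFlat: flat number array.
function predictSign(trainedSigns, liveFlat) {
  let best = { name: null, dist: Infinity };
  for (const [name, samples] of Object.entries(trainedSigns)) {
    for (const sample of samples) {
      const dist = euclideanDistance(sample, liveFlat);
      if (dist < best.dist) best = { name, dist };
    }
  }
  return best.name;
}
```

Storing several samples per sign (as the Features list recommends) makes this search more robust, since the live pose only has to fall near one of them.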
- HTML & CSS: UI structure and styling.
- JavaScript: Hand gesture recognition logic.
- MediaPipe Hands API: Real-time hand tracking.
- Bootstrap: Responsive design.
- Clone this repository:

  ```
  git clone https://github.com/tron01/HandSignPrediction.git
  ```

- Open `index.html` in a browser.
- Implement persistent storage for trained signs.
- Improve recognition accuracy with ML models.
- Add an export/import feature for trained gestures.
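Since the trained data is plain arrays of numbers keyed by sign name, the persistence and export/import items could both start from simple JSON serialization. A possible sketch, with hypothetical function names (the project does not implement this yet):

```javascript
// Serialize the trained-sign store to a JSON string, suitable for
// writing to a file or to localStorage in the browser.
function exportSigns(trainedSigns) {
  return JSON.stringify(trainedSigns);
}

// Rebuild the trained-sign store from a previously exported string.
function importSigns(json) {
  return JSON.parse(json);
}

// In a browser, persistence could then be as simple as:
//   localStorage.setItem('handSigns', exportSigns(trainedSigns));
//   const restored = importSigns(localStorage.getItem('handSigns'));
```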
This project is open-source and available under the MIT License.