# Real-Time Hand Sign Detection

Welcome to the Real-Time Hand Sign Detection repository! This project leverages deep learning to detect and recognize hand signs in real time, using a custom-trained YOLOX model. The system is designed for applications such as gesture-based interaction, sign language interpretation, and more.
## Table of Contents

- Features
- Demo
- Installation
- Usage
- Project Structure
- Model Training
- Auto Labeling Tool
- Future Work
- Contributing
- License
## Features

- **Real-Time Detection**: Uses a YOLOX model for accurate, real-time hand sign detection.
- **Customizable and Scalable**: Includes utilities for custom data labeling, model training, and evaluation.
- **Auto Labeling Tool**: A built-in tool for automatic data labeling to streamline the preparation of training datasets.
- **Python-Based Implementation**: Easily customizable and extendable Python code.
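To illustrate what an auto-labeling step typically produces, here is a minimal sketch that converts a pixel-space bounding box into a YOLO-style label line (normalized center, width, and height). The function name and box layout are illustrative assumptions, not the repository's actual API:

```python
def to_yolo_line(class_id, box, img_w, img_h):
    """Convert a pixel-space box (x_min, y_min, x_max, y_max) into a
    YOLO-style label line: "class cx cy w h", all normalized to [0, 1].

    Illustrative sketch only -- the repo's auto labeling tool may use a
    different format internally.
    """
    x_min, y_min, x_max, y_max = box
    cx = (x_min + x_max) / 2 / img_w   # normalized box center x
    cy = (y_min + y_max) / 2 / img_h   # normalized box center y
    w = (x_max - x_min) / img_w        # normalized box width
    h = (y_max - y_min) / img_h        # normalized box height
    return f"{class_id} {cx:.6f} {cy:.6f} {w:.6f} {h:.6f}"
```

One such line is written per detected hand, so a whole image's annotations become a small text file next to the image.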
## Demo

A demo of the hand sign detection in action can be run by following the instructions below. The demo uses the `Ninjutsu_demo.py` script to detect hand signs in a live video feed.
## Installation

To set up this project locally, follow these steps:

1. **Clone the Repository**

   ```bash
   git clone https://github.com/Chowdhurynaseeh/Realtime-handsign-detection.git
   cd Realtime-handsign-detection
   ```

2. **Install Dependencies**

   Make sure Python 3.7+ is installed, then install the required dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. **Download Model Weights**

   Download the pre-trained YOLOX model weights from the YOLOX GitHub repository and place them in the `model/yolox` directory.
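After placing the weights, a quick sanity check from Python can confirm they landed in the expected directory. The file extensions checked here are assumptions (YOLOX releases commonly ship `.pth` or `.onnx` files); adjust them to match the weights you downloaded:

```python
from pathlib import Path

def find_weights(weights_dir="model/yolox", exts=(".pth", ".onnx")):
    """Return the names of weight files found directly under weights_dir.

    Sketch for a quick setup check; the expected extensions are an
    assumption about the downloaded YOLOX weights.
    """
    d = Path(weights_dir)
    if not d.is_dir():
        return []  # directory missing -- weights not placed yet
    return sorted(p.name for p in d.iterdir() if p.suffix in exts)
```

An empty result means the directory is missing or holds no recognizable weight files.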
## Usage

To run the real-time hand sign detection demo:

```bash
python Ninjutsu_demo.py
```