In critical situations, deaf-mute people may face communication difficulties because of the language barrier, for example when a deaf-mute person cannot ask for help. We have designed software that translates international-alphabet gestures to text without any additional hardware. The software is written in Python, a cross-platform language, so it can run on smartphones, laptops, and other devices. To detect a hand against a colorful background under varying illumination, we exploit the fact that the hands of a person using the gestural alphabet move continuously: by selecting the regions of the image that change between frames, one can isolate the region of hand movement. A histogram of oriented gradients (HOG) is then computed for the selected region, and an SVM classifier, trained on several thousand images, recognizes the HOG descriptor and decides which letter is being shown. Recognition stability was estimated as the fraction of true-positive responses in a series of gestures presented at various illumination levels and in the presence of background clutter (clothing items, furnishings). Our algorithm reaches up to 95% recognition stability under illumination ranging from 700 lux (bright room light) down to 150 lux (ambient light on a cloudy day). It recognizes gestures quickly under all tested conditions, thus expanding communication opportunities for people with disabilities. Furthermore, our method can be used for gesture recognition in children's education, in creating game environments, etc.
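The pipeline described above (frame differencing to find the moving hand, then HOG features for a classifier) can be sketched as follows. This is a minimal illustration in plain NumPy, not the authors' implementation: the function names, the change threshold, and the cell/bin sizes are assumptions made for the example, and a real system would feed the resulting feature vector to a trained SVM (e.g. scikit-learn's `sklearn.svm.SVC`).

```python
import numpy as np

def motion_region(prev_frame, frame, thresh=25):
    """Bounding box of pixels that changed between two grayscale frames
    (simple frame differencing; assumes the signing hand is the main
    moving object). Returns (y0, y1, x0, x1) or None if nothing moved."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int)) > thresh
    ys, xs = np.nonzero(diff)
    if ys.size == 0:
        return None
    return ys.min(), ys.max() + 1, xs.min(), xs.max() + 1

def hog(patch, cell=8, bins=9):
    """Simplified histogram of oriented gradients for a grayscale patch:
    per-cell histograms of unsigned gradient orientation, weighted by
    gradient magnitude and normalized per cell."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned orientation
    h, w = patch.shape
    feats = []
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            a = ang[y:y + cell, x:x + cell]
            m = mag[y:y + cell, x:x + cell]
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))
    return np.concatenate(feats)
```

In use, `motion_region` would be applied to consecutive video frames, the cropped region resized to a fixed size, and the `hog` vector passed to the SVM for letter classification.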
Fourth Award of $500