ISEF | Projects Database | Finalist Abstract

Speaking Gloves: Cheap and Improved Wearable Gloves That Translate Sign Language Into Speech Using Velostat Based Sensors and Machine Learning

Booth Id:
EBED044T

Category:
Embedded Systems

Year:
2023

Finalist Names:
Fahmy, Mohamad (School: Obour STEM School)
Ramadan, Yousef (School: Obour STEM School)

Abstract:
The deaf and mute community relies on sign language for communication. Unfortunately, the vast majority of the hearing population cannot understand sign language, which creates a communication barrier for the deaf and mute community. Our project aims to help this community communicate with the people around them by providing a cheap, accurate, and reliable solution that can function anywhere and translate sign language into speech. Our solution is a smart pair of gloves that measures both finger bends and hand motion to recognize and translate signs in sign language. To measure finger bends, we exploit a special property of a material called Velostat, whose electrical resistance changes when it is bent or pressed, allowing us to determine the amount of bending on each finger of both gloves. For hand motion, we use an ADXL345 accelerometer, which measures the acceleration of the hand and lets us determine the user's exact hand movements. Our gloves initially suffered from low accuracy: they could not differentiate between signs that are similar to each other. However, by changing the way data is extracted from the accelerometer and by using a machine learning classification model on the extracted data, we achieved much better results. Our results suggest that this system can be relied on to recognize all signs of sign language, both stationary signs and signs involving motion.
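The pipeline the abstract describes, reading Velostat bend sensors and classifying the combined sensor features, could be sketched as follows. This is a minimal illustration, not the project's implementation: the voltage-divider wiring, the 10 kΩ fixed resistor, and the nearest-centroid classifier standing in for the machine learning model are all assumptions.

```python
import math

def velostat_resistance(adc_value, adc_max=1023, vcc=5.0, r_fixed=10_000):
    """Estimate a Velostat strip's resistance from a voltage-divider ADC reading.

    Assumed wiring (illustrative): Vcc -- R_fixed -- [ADC tap] -- Velostat -- GND,
    so V_out = Vcc * R_velostat / (R_fixed + R_velostat). Bending the strip
    changes R_velostat, which shifts V_out and hence the ADC reading.
    """
    v_out = vcc * adc_value / adc_max
    if v_out >= vcc:  # full-scale reading: avoid division by zero
        return float("inf")
    return r_fixed * v_out / (vcc - v_out)

def classify_sign(features, centroids):
    """Assign a feature vector (finger-bend values plus accelerometer axes)
    to the sign whose stored centroid is nearest in Euclidean distance --
    a simple stand-in for the project's classification model."""
    return min(centroids, key=lambda sign: math.dist(features, centroids[sign]))

# Toy example: two signs described by 3 features (2 finger bends + 1 accel axis).
centroids = {
    "hello": [0.1, 0.9, 0.2],
    "thanks": [0.8, 0.2, 0.7],
}
sample = [0.15, 0.85, 0.25]
print(classify_sign(sample, centroids))  # -> hello
```

In practice, one feature per finger of both gloves plus the three ADXL345 axes would give a 13-dimensional feature vector, and the centroid lookup would be replaced by a trained classifier.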

Awards Won:
Second Award of $2,000