Over 360 million people worldwide are deaf (328 million adults and 32 million children), and many of them rely on sign language to communicate. While previous attempts have been made to develop an American Sign Language (ASL)-to-English translator, all of them employed external hardware to achieve their goals. This project created a mobile application, named iSight, to help facilitate communication with those who use ASL. After experiments with template matching and temporally-based analysis, it was determined that a non-temporal method of adaptive skin detection and contour analysis, with the Hu moments of the processed images classified by a support vector machine, was best suited to analyzing hand gestures, as it decreased the error of analysis by 30% on average. Because of the limited processing power of mobile devices, the application was divided into two parts: a client and a server. The server analyzed images sent from the client application, designed for the iOS platform, and reported the results of the analysis back to the client over a secure WebSocket connection. Testing showed that the system analyzed and reported the input data efficiently, with iSight accurately identifying both static and moving signs 60% of the time on average. While the application is still a prototype, the benefits it could offer to communication and understanding are numerous.
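The feature-extraction step described above can be sketched in pure Python. This is a simplified, hypothetical illustration of computing the seven Hu moment invariants from a binary hand mask (the project itself presumably used a library implementation such as OpenCV's); it is not the application's actual code.

```python
# Illustrative sketch: the seven Hu moment invariants of a binary mask,
# the kind of shape descriptor that can be fed to an SVM classifier.
# Pure Python, for clarity; real pipelines would use an optimized library.

def hu_moments(mask):
    """Return the 7 Hu invariants of a binary image (list of 0/1 rows)."""
    # Raw moments: M_pq = sum over foreground pixels of x^p * y^q
    def raw(p, q):
        return sum(x**p * y**q
                   for y, row in enumerate(mask)
                   for x, v in enumerate(row) if v)

    m00 = raw(0, 0)                       # area of the shape
    cx, cy = raw(1, 0) / m00, raw(0, 1) / m00  # centroid

    # Central moments (translation-invariant)
    def mu(p, q):
        return sum((x - cx)**p * (y - cy)**q
                   for y, row in enumerate(mask)
                   for x, v in enumerate(row) if v)

    # Scale-normalized central moments
    def eta(p, q):
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)

    # Hu's seven rotation-invariant combinations
    return [
        n20 + n02,
        (n20 - n02)**2 + 4 * n11**2,
        (n30 - 3*n12)**2 + (3*n21 - n03)**2,
        (n30 + n12)**2 + (n21 + n03)**2,
        (n30 - 3*n12)*(n30 + n12)*((n30 + n12)**2 - 3*(n21 + n03)**2)
        + (3*n21 - n03)*(n21 + n03)*(3*(n30 + n12)**2 - (n21 + n03)**2),
        (n20 - n02)*((n30 + n12)**2 - (n21 + n03)**2)
        + 4*n11*(n30 + n12)*(n21 + n03),
        (3*n21 - n03)*(n30 + n12)*((n30 + n12)**2 - 3*(n21 + n03)**2)
        - (n30 - 3*n12)*(n21 + n03)*(3*(n30 + n12)**2 - (n21 + n03)**2),
    ]
```

These seven numbers are invariant to translation and scale, and (up to a sign on the seventh) to rotation, which is what makes them a compact, pose-tolerant feature vector for classifying hand shapes.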
Third Award of $1,000