
Artificial Sign Language: A Virtual Translator for American Sign Language (ASL) Users

Booth Id:
ROBO053T

Category:
Robotics and Intelligent Machines

Year:
2023

Finalist Names:
Lin, Xin Lei (School: Marianopolis College)
Shariff, Aly (School: Marianopolis College)

Abstract:
Our design employs two machine learning models. First, a visual transcript of hand signs and other physical features must be produced and accurately converted into a string of words. This task is accomplished with the tracking software MediaPipe, whose output is processed and fed into a machine learning model. The software detects and extracts the coordinates of a person's body landmarks (e.g., hands, arms, face), which are then analyzed by a neural network. We investigated several machine learning approaches, from transfer learning and convolutional neural networks (CNNs) to a combination of Long Short-Term Memory (LSTM) and dense neural networks. These models were trained on carefully preprocessed datasets that extract the most important information from videos collected through web scraping in Python, and they can reliably classify videos of ASL gestures into English words. In addition, an algorithm based on graph theory was developed to distinguish between the individual words signed in a single take, enabling the translation of complete sentences. However, the resulting word sequence is not grammatical English, so a second model and algorithm, based on natural language processing, must be created to bridge this translation gap. The combination of these technologies may later be packaged into a mobile application that uses a phone's camera for real-time translation. The app would have a user-friendly interface and would include speech recognition to allow communication between non-ASL users and deaf people.
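
As a minimal Python sketch of the pipeline described above, the code below extracts MediaPipe Holistic landmarks from each video frame, flattens them into per-frame feature vectors, and classifies the resulting sequence with an LSTM-plus-dense network. The sequence length, vocabulary size, and layer sizes are illustrative assumptions, not the finalists' actual configuration.

    # Minimal sketch: MediaPipe Holistic landmarks per frame, flattened into
    # feature vectors, then classified by an LSTM + dense network.
    # SEQ_LEN, NUM_CLASSES, and layer sizes are assumed values for illustration.
    import numpy as np
    import mediapipe as mp
    import tensorflow as tf

    mp_holistic = mp.solutions.holistic

    SEQ_LEN = 30          # frames per gesture clip (assumed)
    NUM_CLASSES = 100     # size of the ASL word vocabulary (assumed)
    # 33 pose + 21 left-hand + 21 right-hand landmarks, each with (x, y, z)
    FEATURES = (33 + 21 + 21) * 3

    def frame_to_features(results):
        """Flatten one frame's MediaPipe Holistic landmarks into a fixed-length vector."""
        def flatten(landmarks, count):
            if landmarks is None:                  # landmark set not detected in this frame
                return np.zeros(count * 3)
            return np.array([[p.x, p.y, p.z] for p in landmarks.landmark]).flatten()
        return np.concatenate([
            flatten(results.pose_landmarks, 33),
            flatten(results.left_hand_landmarks, 21),
            flatten(results.right_hand_landmarks, 21),
        ])

    def video_to_sequence(frames_rgb, holistic):
        """Convert a list of RGB frames into a (SEQ_LEN, FEATURES) array, padding short clips."""
        feats = [frame_to_features(holistic.process(f)) for f in frames_rgb[:SEQ_LEN]]
        while len(feats) < SEQ_LEN:
            feats.append(np.zeros(FEATURES))
        return np.stack(feats)

    # Sequence classifier: LSTM over the landmark time series, softmax over candidate words.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(SEQ_LEN, FEATURES)),
        tf.keras.layers.LSTM(128, return_sequences=False),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Example usage with a list of RGB frames `frames`:
    #   with mp_holistic.Holistic(static_image_mode=False) as holistic:
    #       seq = video_to_sequence(frames, holistic)
    #       word_probs = model.predict(seq[np.newaxis, ...])

Working on landmark coordinates rather than raw pixels keeps the input compact, which is one plausible reason the abstract pairs MediaPipe tracking with a recurrent classifier instead of relying solely on image-based CNNs.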