ISEF | Projects Database | Finalist Abstract

Sign Language Translator

Booth Id:
SOFT022T

Category:
Systems Software

Year:
2019

Finalist Names:
Dikle, Zeynep (School: Nazmi Arikan Fen Bilimleri High School)
Mavi, Arda (School: Ayranci Anadolu Lisesi)

Abstract:
This study aimed to facilitate communication between people who use sign language and those who do not. As a solution, a mobile application was developed that uses artificial intelligence to convert defined American Sign Language (ASL) hand signs into sound in real time. Two datasets were created within the scope of the project. The “Sign Language Digits Dataset” was created with the collaboration of 218 high school students; it consists of ASL digit samples only, captured under good lighting against a white background, and is available on GitHub and Kaggle. The “Sign Language Dataset” was created with the collaboration of 173 high school students; it consists of ASL digits and some nouns and verbs, sampled without any lighting or environment dependency. Using the prepared dataset, three components were developed: a Convolutional Neural Network (CNN) based deep learning model that generates new data samples, an unsupervised machine learning algorithm that pre-labels the unlabeled samples, and CNN-based deep learning models that detect sign language in real time. These machine learning models were trained on the created sign language dataset and deployed to mobile devices as a real-time sign language converter application that works without any lighting or environment dependency. The project has a dynamic structure, which allows it to be applied to different problem areas (e.g., communication solutions for customer-oriented companies and control of robots).
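
To illustrate the kind of pipeline the abstract describes, the sketch below builds a small CNN classifier for ASL digit images and exports it in a mobile-friendly format. This is a minimal sketch, not the authors' actual architecture: the 64x64 grayscale input shape, the layer sizes, and the use of Keras with a TensorFlow Lite export are assumptions made for illustration only.

```python
# Minimal sketch of a CNN digit-sign classifier, assuming the
# "Sign Language Digits Dataset" has been loaded as 64x64 grayscale
# images in x_train / y_train (integer labels 0-9). Layer sizes and
# the TFLite export are illustrative assumptions, not the authors' setup.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10           # ASL digits 0-9
INPUT_SHAPE = (64, 64, 1)  # assumed image size, single channel

def build_model():
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=INPUT_SHAPE),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Placeholder arrays so the script runs end to end; replace with the
    # real dataset loaded from disk.
    x_train = np.random.rand(32, 64, 64, 1).astype("float32")
    y_train = np.random.randint(0, NUM_CLASSES, size=(32,))

    model = build_model()
    model.fit(x_train, y_train, epochs=1, batch_size=8)

    # One possible route to a mobile deployment: export to TensorFlow Lite.
    tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
    with open("sign_digits.tflite", "wb") as f:
        f.write(tflite_model)
```

A model exported this way could be bundled into a mobile app and run on camera frames to produce real-time predictions; the generative data-augmentation model and the unsupervised pre-labeling step mentioned in the abstract would sit upstream of this training script.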

Awards Won:
Fourth Award of $500