A Deep Neural Architecture Search Net-Based Wearable Object Classification System for the Visually Impaired

Booth Id: ROBO056
Category: Robotics and Intelligent Machines
Year: 2023
Finalist Names: Arvind, Aniketh (School: Hackley School)

Abstract:
The World Health Organization estimates that a staggering 2.2 billion people worldwide live with a vision impairment, a condition that drastically limits independence and quality of daily life. Although machine learning has made significant strides in recent years, particularly in image classification, these advances have predominantly targeted tasks that are visual in nature and therefore remain largely inaccessible to the visually impaired. Much work has been published on obstacle avoidance and large-object detection for the visually impaired, but little has been done to help them understand complex indoor daily-living environments. For these reasons, this study develops and presents a wearable object classification system specifically designed to assist the visually impaired in identifying small tabletop objects commonly found in their surrounding indoor environments. Through transfer learning, the system combines a pretrained neural architecture search network, NASNet-Mobile, with a custom image dataset to perform small-object classification at model accuracies above 90%. The resulting transfer-learning model is then deployed on a wrist-worn wearable device for real-world applicability. The study evaluates and demonstrates the system’s ability to accurately classify small tabletop objects in an eight-trial experiment, which yields an average precision, recall, and F1 score of 99.30%, 97.93%, and 98.61%, respectively. Overall, this system represents a significant step forward in the development of machine learning systems that constructively assist the visually impaired while improving their daily independence and quality of life.
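
Although the abstract gives no implementation details, the transfer-learning setup it describes (a pretrained NASNet-Mobile backbone with a new classification head trained on a custom image dataset) might be sketched in Keras roughly as follows. The class count, input resolution, and training hyperparameters below are illustrative assumptions, not values from the study.

    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Assumed values: the abstract does not specify the number of object
    # categories or the input resolution used in the study.
    NUM_CLASSES = 10          # hypothetical count of tabletop object classes
    IMG_SIZE = (224, 224)     # NASNet-Mobile's default input resolution

    # Load NASNet-Mobile pretrained on ImageNet, without its classifier head.
    base = tf.keras.applications.NASNetMobile(
        input_shape=IMG_SIZE + (3,),
        include_top=False,
        weights="imagenet",
    )
    base.trainable = False  # freeze the pretrained feature extractor

    # Attach a small classification head for the custom object categories.
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.2),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )

    # train_ds and val_ds would be tf.data.Dataset objects built from the
    # custom image dataset, which is not available here.
    # model.fit(train_ds, validation_data=val_ds, epochs=10)

Freezing the backbone and training only the new head is the standard first stage of transfer learning; fine-tuning the upper backbone layers afterward is a common refinement.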
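
The reported evaluation metrics follow the standard definitions: precision is TP / (TP + FP), recall is TP / (TP + FN), and F1 is the harmonic mean of the two. The sketch below shows how per-trial scores could be computed and averaged across eight trials; the counts in it are hypothetical placeholders, not the experiment's data.

    def precision_recall_f1(tp, fp, fn):
        """Standard classification metrics from raw true/false positive counts."""
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        f1 = 2 * precision * recall / (precision + recall)
        return precision, recall, f1

    # Hypothetical per-trial (TP, FP, FN) counts, one tuple per trial
    # (NOT the study's data).
    trials = [(48, 1, 2), (50, 0, 1), (47, 0, 0), (49, 1, 1),
              (50, 0, 0), (46, 1, 2), (49, 0, 1), (48, 0, 1)]

    scores = [precision_recall_f1(tp, fp, fn) for tp, fp, fn in trials]
    avg_p = sum(s[0] for s in scores) / len(scores)
    avg_r = sum(s[1] for s in scores) / len(scores)
    avg_f = sum(s[2] for s in scores) / len(scores)
    print(f"average precision={avg_p:.2%}, recall={avg_r:.2%}, F1={avg_f:.2%}")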