ISEF | Projects Database | Finalist Abstract

Project Vision: Assisted Navigation & Waypoint Positioning Through 3D Mapping

Booth Id:
EBED013

Category:
Embedded Systems

Year:
2024

Finalist Names:
Nagler, James (School: Garden City High School)

Abstract:
According to the World Health Organization (WHO), 2.2 billion people have some form of visual impairment, and in nearly half of these cases the impairment could be prevented or addressed with more affordable solutions. My project attempts to address this problem by creating an ultra-compact, affordable pair of glasses that can simultaneously map an environment, navigate around obstacles, position the user in 3D space, and effectively transmit this information to the user. Specifically, I tested the accuracy of the 3D point cloud, real-time positioning, waypoint positioning, and time-of-flight (ToF) variability. The glasses are powered by four custom circuit boards, wide-angle ToF sensors, custom-coded positioning algorithms, vibration motors, bone conduction speakers, inertial measurement units (IMUs), and a 3D-printed frame. To accomplish real-time positioning and live 3D mapping, custom algorithms and the ToF sensors are used when an object is within a four-meter (about 13 ft) radius of the wearer. Vibration motors on all sides of the frame notify the user of incoming obstacles, and a bone conduction speaker provides more descriptive feedback. Most notably, the headset can automatically determine waypoints within a room and autonomously navigate the user to a specific destination, such as a bed or bathroom, while accounting for any new obstacles in their path. If an object is outside this radius, the device can still use the ToF sensors to alert the user to obstacles and, if outdoors and connected to a phone, guide the wearer to a destination through the speaker.
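The abstract does not publish the device's algorithms, so the following is only an illustrative sketch of the two behaviors it describes: firing direction-specific vibration motors when a ToF reading falls inside the four-meter alert radius, and planning a route to a waypoint around obstacles. The function names, the per-side reading format, and the use of breadth-first search over a 2-D occupancy grid are all my assumptions, not the finalist's actual implementation.

```python
from collections import deque

ALERT_RADIUS_M = 4.0  # alert radius stated in the abstract (~13 ft); assumed constant


def obstacle_alerts(tof_readings):
    """Map per-direction ToF distances (meters) to vibration-motor triggers.

    tof_readings: dict like {"front": 1.2, "left": 6.0}; format is assumed.
    Returns the set of sides whose motor should vibrate.
    """
    return {side for side, dist in tof_readings.items() if dist <= ALERT_RADIUS_M}


def plan_path(grid, start, goal):
    """Breadth-first search over a 2-D occupancy grid (0 = free, 1 = obstacle).

    Stands in for the headset's waypoint navigation: re-running it whenever the
    live 3D map changes would account for newly detected obstacles.
    Returns a list of (row, col) cells from start to goal, or None if blocked.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # visited set doubling as backtracking links
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct path by walking prev links back
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None                   # no free route to the waypoint
```

For example, `obstacle_alerts({"front": 1.2, "left": 6.0})` would trigger only the front motor, and `plan_path` on a small grid with a wall down the middle returns a detour around it. BFS is chosen here purely because it is the simplest shortest-path method on a uniform grid; the real device could use any planner over its 3D point cloud.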