ISEF | Projects Database | Finalist Abstract

Back to Search Results | Print PDF

Mind Beacon: A Portable Spatial Recognition Device for the Visually Impaired Using a 3D Depth Sensor and Custom Tactile Display

Booth Id:
ROBO083

Category:
Robotics and Intelligent Machines

Year:
2022

Finalist Names:
Jun, Seoyoung (School: Thomas Jefferson High School for Science and Technology)

Abstract:
According to the WHO, 2.2 billion people worldwide currently live with a vision impairment. A sighted person acquires 3D spatial information by processing images in the visual cortex to determine the 3D locations of objects, whereas the visually impaired obtain 3D information mainly through their sense of touch. There have been many advancements in the field of aid devices for the visually impaired, but many of these devices are limited because they only notify the user of immediate obstacles. This research project introduces a handheld device that conveys 3D spatial geometric information to the visually impaired using a 3D depth sensor and an innovative tactile display. The tactile display consists of linear servo motors and pogo pins that move vertically according to the spatial geometry measured by the 3D depth sensor. Through the tactile display, users can recognize the position and height of obstacles and thereby understand the 3D space around them. This project implements a prototype of a portable device, Mind Beacon, which measures 3D space and streams the spatial geometry to the tactile display. Mind Beacon consists of a RealSense D435i sensor, a Raspberry Pi 4, a newly built tactile display, a potentiometer, and rechargeable batteries. Mind Beacon operates independently without an external computer or power supply and has successfully completed functional and performance tests. In conclusion, Mind Beacon introduces a new way for the visually impaired to receive 3D spatial information.
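The core pipeline the abstract describes, reducing a dense depth frame to a coarse grid of pin heights, can be sketched as follows. This is a minimal illustration, not the project's actual code: the function name, grid size, depth range, and pin travel are all hypothetical, and the depth frame is represented as a plain NumPy array rather than a live RealSense stream.

```python
import numpy as np

def depth_to_pin_heights(depth_mm, rows=4, cols=4,
                         max_range_mm=4000.0, max_travel_mm=10.0):
    """Downsample a depth frame (in mm) into a rows x cols grid of
    tactile-pin heights. Nearer obstacles raise pins higher; points at
    or beyond max_range_mm (or invalid zero readings) leave the pin
    fully retracted. All parameter values here are illustrative.
    """
    h, w = depth_mm.shape
    # Trim so the frame divides evenly into rows x cols blocks.
    depth = depth_mm[: h - h % rows, : w - w % cols].astype(float)
    # Treat zero (no sensor return) as out of range.
    depth[depth == 0] = max_range_mm
    blocks = depth.reshape(rows, depth.shape[0] // rows,
                           cols, depth.shape[1] // cols)
    # Take the nearest point in each block so small obstacles
    # are not averaged away.
    nearest = blocks.min(axis=(1, 3))
    # Linear map: 0 mm -> full pin travel; max_range_mm -> retracted.
    return (1.0 - np.clip(nearest / max_range_mm, 0.0, 1.0)) * max_travel_mm
```

In a real device, each cell of the returned grid would be sent to the corresponding linear servo as a target position on every frame.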

Awards Won:
Second Award of $2,000
Association for Computing Machinery: Fourth Award of $500