ISEF | Projects Database | Finalist Abstract

A New Paradigm for the Visually Impaired Spectrum based on a LiDAR System

Booth Id:
EBED012

Category:
Embedded Systems

Year:
2021

Finalist Names:
Krishnan, Nethra (School: Plano West Senior High School)

Abstract:
With the advent of driverless cars, distance-sensing radars have become prominent. Can the user "feel" the distance and navigate? This project explores advanced technology for the visually impaired and attempts to use it to address the needs of that large group of people. Is it realistic to apply the latest sensing technology to assist individuals who have lost both vision and hearing? One of the latest and most precise distance-sensing technologies is LiDAR, which stands for Light Detection And Ranging. It is increasingly used in Advanced Driver Assistance Systems and self-driving cars, where precision, topography, and speed matter. Could these same features be vital for a visually impaired person, who could similarly map out the surrounding area? The concept is to sense distance and provide two channels of feedback to the user, namely audio and haptics. Arduino code was written so that the pulse-width modulation (PWM) signal was inversely proportional to distance; that is, the shorter the distance, the higher the frequency. To simulate daily life, objects of different sizes were measured at varying distances. The LiDAR has a range exceeding 200 cm, making it suitable for environmental sensing. At short distances (< 50 cm) there is a slight non-linearity, and the accuracy is slightly lower. To reduce noise and improve accuracy, a right-hand Riemann sum was implemented: using the right-hand points on the sweep allows the device to integrate each sector every ten seconds.