ISEF | Projects Database | Finalist Abstract

iAid: A Novel Multimodal, Cloud-Based Navigation System for the Visually Impaired

Booth Id:

Energy: Physical


Finalist Names:
Deans, Alexander

A novel, user-worn device (iAid) was devised to help visually impaired people navigate, built on an Arduino microcontroller and an Android platform. Voice commands were used to input the destination. Outdoor navigation was facilitated by a GPS/compass-controlled joystick that provided tactile feedback to the user: using two servo motors, it tilted to indicate distance and rotated to show direction to the destination. Waypoints to the destination were determined on the Android platform using the Google Directions API and relayed to the Arduino over Bluetooth, enabling turn-by-turn navigation. Four belt-mounted ultrasonic sensors scanned a horizontal 90° field, and obstacles within 60 cm triggered a high-frequency piezo buzzer. Indoors, the GPS and compass were deactivated with a pushbutton, and the belt-mounted sensors rotated the joystick to indicate the most open path. Twenty volunteers were tested navigating unaided, with the aid of a guide cane, and with the iAid. Compared to navigating unaided, the iAid reduced collision frequency by 80% and was 50% faster; compared to navigating with a guide cane, it reduced collisions by 58% and was 21% faster. The iAid was also tested successfully outdoors on 23 representative routes ranging from 400 m to 6 km, and eleven visually impaired patients navigated an apartment complex with no major collisions. The iAid is the only voice-activated mobile device providing tactile feedback that is adapted for practical use. It adapts to high-complexity environments and plans a safe route through changing surroundings for the visually impaired.
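The guidance logic described above can be sketched in plain C++. This is a hypothetical illustration, not the author's actual firmware: it assumes the device computes great-circle distance and initial bearing from the current GPS fix to the next waypoint (driving the joystick's tilt and rotation servos), and sounds the buzzer when any belt-mounted ultrasonic sensor reads under the 60 cm threshold. All function names and parameters here are assumptions for illustration.

```cpp
#include <cmath>

constexpr double kPi = 3.14159265358979323846;
constexpr double kEarthRadiusM = 6371000.0;   // mean Earth radius
constexpr double kDeg2Rad = kPi / 180.0;

// Haversine distance in metres between two lat/lon points (degrees).
// Would drive the joystick's tilt servo (closer waypoint -> flatter tilt).
double distanceM(double lat1, double lon1, double lat2, double lon2) {
    double p1 = lat1 * kDeg2Rad, p2 = lat2 * kDeg2Rad;
    double dp = (lat2 - lat1) * kDeg2Rad, dl = (lon2 - lon1) * kDeg2Rad;
    double a = std::sin(dp / 2) * std::sin(dp / 2) +
               std::cos(p1) * std::cos(p2) * std::sin(dl / 2) * std::sin(dl / 2);
    return 2.0 * kEarthRadiusM * std::asin(std::sqrt(a));
}

// Initial bearing to the waypoint in degrees (0 = north, clockwise).
double bearingDeg(double lat1, double lon1, double lat2, double lon2) {
    double p1 = lat1 * kDeg2Rad, p2 = lat2 * kDeg2Rad;
    double dl = (lon2 - lon1) * kDeg2Rad;
    double y = std::sin(dl) * std::cos(p2);
    double x = std::cos(p1) * std::sin(p2) -
               std::sin(p1) * std::cos(p2) * std::cos(dl);
    double b = std::atan2(y, x) / kDeg2Rad;
    return std::fmod(b + 360.0, 360.0);
}

// Joystick rotation relative to the user's compass heading, in (-180, 180].
// A negative value would rotate the joystick left, positive right.
double relativeHeadingDeg(double bearing, double compassHeading) {
    return std::fmod(bearing - compassHeading + 540.0, 360.0) - 180.0;
}

// Obstacle alarm: true if any of the n ultrasonic range readings (cm)
// falls inside the 60 cm threshold, which would trigger the piezo buzzer.
bool obstacleAlarm(const double rangesCm[], int n, double thresholdCm = 60.0) {
    for (int i = 0; i < n; ++i)
        if (rangesCm[i] > 0.0 && rangesCm[i] < thresholdCm) return true;
    return false;
}
```

On the Arduino itself, `relativeHeadingDeg` would map directly onto a servo angle and `obstacleAlarm` onto a digital write to the buzzer pin; the math above is kept hardware-free so it can be tested on a desktop.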

Awards Won:
Second Award of $2,000