ISEF | Projects Database | Finalist Abstract

Augmented Reality for Autism

Booth Id:
EBED007

Category:
Embedded Systems

Year:
2019

Finalist Names:
Manrique, Albert (School: MAST at FIU Biscayne Bay Campus)

Abstract:
The purpose of this project is to create affordable augmented reality glasses that help children with autism better recognize emotions. Research conducted at Stanford University indicates that this type of technology can help children with mild autism spectrum disorder (ASD) better read emotions. The glasses are constructed from safety glasses, a Raspberry Pi, a camera, an OLED display, a one-way mirror, and 3D-printed enclosures designed in Fusion 360. The software, written in Python, performs emotion analysis on a picture taken by the camera on the glasses; the analysis itself is computed with the Google Vision API, and the results are shown on the OLED display. For the wearer to focus on the display, there must be about 7 cm between the wearer's eye and the image. To achieve this distance, the display is mounted above the wearer's eye facing downward, with its text reversed, and the one-way mirror reflects the display's image into the wearer's line of sight. The glasses can take an image of the person in front of the wearer, analyze that person's face, and present the wearer with the probabilities of joy, sorrow, surprise, and anger expressed by that person. In conclusion, these emotion-reading augmented reality glasses can be manufactured at low cost. Further revisions of this technology will aim to increase processing speed and recognize a greater range of emotions.
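
The abstract describes a simple capture-analyze-display loop: the Pi camera takes a picture, the Google Vision API reports the emotions of the detected face, and the result is drawn mirrored on the OLED so it reads correctly after reflecting off the one-way mirror. The sketch below shows one way that loop could look in Python; the specific libraries (picamera, google-cloud-vision, Pillow, and the luma.oled SSD1306 driver), the I2C wiring, and the likelihood-to-text mapping are assumptions for illustration, not details taken from the project.

# Minimal sketch of the glasses' capture -> analyze -> display loop.
# Assumed libraries (not stated in the abstract): picamera, google-cloud-vision,
# Pillow, and luma.oled driving an SSD1306 panel over I2C.
import io

from picamera import PiCamera
from google.cloud import vision
from PIL import Image, ImageDraw, ImageFont
from luma.core.interface.serial import i2c
from luma.oled.device import ssd1306

# Vision API face detection reports likelihood categories, indexed 0-5.
LIKELIHOODS = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
               "POSSIBLE", "LIKELY", "VERY_LIKELY"]


def capture_jpeg(camera):
    """Grab one JPEG frame from the camera on the glasses."""
    stream = io.BytesIO()
    camera.capture(stream, format="jpeg")
    return stream.getvalue()


def analyze_emotions(jpeg_bytes):
    """Send the frame to the Google Vision API and return the joy, sorrow,
    surprise, and anger likelihoods for the first detected face."""
    client = vision.ImageAnnotatorClient()
    response = client.face_detection(image=vision.Image(content=jpeg_bytes))
    if not response.face_annotations:
        return {}
    face = response.face_annotations[0]
    return {
        "joy": LIKELIHOODS[face.joy_likelihood],
        "sorrow": LIKELIHOODS[face.sorrow_likelihood],
        "surprise": LIKELIHOODS[face.surprise_likelihood],
        "anger": LIKELIHOODS[face.anger_likelihood],
    }


def show_mirrored(device, results):
    """Render the results as text, then flip the image left-to-right so it
    reads correctly after reflecting off the one-way mirror."""
    canvas = Image.new("1", (device.width, device.height))
    draw = ImageDraw.Draw(canvas)
    font = ImageFont.load_default()
    for row, (emotion, likelihood) in enumerate(results.items()):
        draw.text((0, row * 12), f"{emotion}: {likelihood}", font=font, fill=255)
    device.display(canvas.transpose(Image.FLIP_LEFT_RIGHT))


if __name__ == "__main__":
    oled = ssd1306(i2c(port=1, address=0x3C))  # assumed I2C bus and address
    with PiCamera() as camera:
        frame = capture_jpeg(camera)
        emotions = analyze_emotions(frame) or {"face": "none detected"}
        show_mirrored(oled, emotions)

Note that the Vision API's face detection returns likelihood categories (VERY_UNLIKELY through VERY_LIKELY) rather than numeric probabilities, so displaying the percentages mentioned in the abstract would require an additional mapping step or a different analysis endpoint.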

Awards Won:
Florida Institute of Technology: Full Tuition Presidential Scholarship