Visualization of Instructional Objects Based on Artificial Intelligence Recognition and Human Interactions Including EEG, Voice, and Hand Action

Booth Id:
SOFT021

Category:
Systems Software

Year:
2020

Finalist Names:
Kim, Hyogi (School: Ewha Womans University High School)

Abstract:
Thanks to Descartes's coordinate system, humans can share exact positional information with one another. In personal communication or in the classroom, however, it is not easy to instantly share an exact object with a specific shape, color, and size. The object may be a geometric polygon, a regional map, or sometimes a chemical symbol that is not easy to draw with chalk. In this study, I propose interactive object-visualization procedures that use human perceptual functions such as voice, hand actions, and brain waves. With the human voice, we can identify the category and the object itself, and its selection can be made with hand actions. Furthermore, the object can be modified according to the attention level of the user's brain waves. For the practical recognition functions, I use a convolutional neural network algorithm for voice recognition and a color-histogram filter for hand recognition. In my tests, any region whose size is less than 3% of the whole screen is filtered out as noise, and if its color does not fall within a preset hand-color range, the region is not recognized as a human hand. To obtain the modification value for the geometric object, the average EEG attention level is calculated every 3 seconds. With this interactive visualization method, an instructor can project a measurable object onto the presentation board more precisely, and the object can be displayed on a handheld screen to share its exact shape with any remote communicator. Through further studies, I expect the methodologies from this study can be used in diverse instructional applications as an object-oriented communication protocol.
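
For illustration only, here is a minimal sketch of the kind of convolutional network that could classify short spoken commands from log-mel spectrograms. The abstract does not specify the architecture, so the layer sizes, class count, and all names below are assumptions, not the project's actual model.

    import torch
    import torch.nn as nn

    class KeywordCNN(nn.Module):
        # Toy CNN over log-mel spectrograms for spoken-command recognition.
        # Layer sizes and the number of classes are illustrative assumptions.
        def __init__(self, n_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.head = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes)
            )

        def forward(self, x):
            # x: (batch, 1, n_mels, time_frames)
            return self.head(self.features(x))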
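The hand-recognition rules stated in the abstract (discard regions under 3% of the screen as noise; require a preset hand-color range) could look roughly like the following OpenCV sketch. The HSV bounds and the function name are hypothetical; the project's calibrated color range is not given.

    import cv2
    import numpy as np

    # Hypothetical preset hand-color range in HSV.
    HAND_HSV_LOW = np.array([0, 40, 60])
    HAND_HSV_HIGH = np.array([25, 255, 255])
    MIN_AREA_FRACTION = 0.03  # regions under 3% of the screen count as noise

    def detect_hand(frame_bgr):
        # Return the largest color-matched region, or None if every
        # candidate is rejected by the size or color rule.
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, HAND_HSV_LOW, HAND_HSV_HIGH)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        screen_area = frame_bgr.shape[0] * frame_bgr.shape[1]
        candidates = [c for c in contours
                      if cv2.contourArea(c) >= MIN_AREA_FRACTION * screen_area]
        return max(candidates, key=cv2.contourArea) if candidates else None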
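Averaging the EEG attention value over a 3-second window, as the abstract describes, might be sketched as below. The sampling interface is assumed; if the headset reports attention on a 0-100 scale, the mean could be mapped directly to a scale factor for the displayed object.

    from collections import deque
    import time

    class AttentionAverager:
        # Keeps EEG attention samples from the last `window` seconds and
        # reports their mean, which can then drive the object modification
        # (e.g., scaling). The data source is assumed, not specified.
        def __init__(self, window=3.0):
            self.window = window
            self.samples = deque()  # (timestamp, attention_value)

        def add(self, value, now=None):
            now = time.monotonic() if now is None else now
            self.samples.append((now, value))
            # Drop samples that have fallen out of the 3-second window.
            while self.samples and now - self.samples[0][0] > self.window:
                self.samples.popleft()

        def mean(self):
            if not self.samples:
                return 0.0
            return sum(v for _, v in self.samples) / len(self.samples)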