ISEF | Projects Database | Finalist Abstract

Neural Action: A Real-time and Accurate Gaze Tracking Application for a More Natural Human-Computer Interaction with User Interfaces

Booth Id:

Category:
Robotics and Intelligent Machines


Finalist Names:
Ham, Jonghyeon (School: Korea Digital Media High School)
Lee, Heejun (School: Dongan High School)

The novel human-computer interaction tool developed in this study replaces the keyboard and mouse for general user interfaces (UI) with a highly accurate, neural-network-based gaze tracker, and creates a practical environment with a unique keyboard layout, word autocomplete, and UI interaction. It enables users to operate a computer with eye gaze alone, setting aside the keyboard and mouse and thus improving accessibility: people, especially people with physical disabilities, can use ordinary applications without any problem. The gaze tracking technology used in this research is backed by a convolutional neural network and has a variety of trained models for different platforms such as PC and mobile. It shows high accuracy (~1.7 cm error), which is sufficient to operate actual UIs. Tracking is fast enough for real time on mobile (~100 ms per frame) and PC (~17 ms per frame) without any hardware acceleration. Furthermore, the network requires only a single RGB camera as input, so gaze can be tracked without a special stereo or infrared camera. The gaze tracking technology developed in this study can be used on mobile devices and on a PC as a cross-platform application, and a developed prototype application is working on PC.
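Operating a UI by gaze alone requires some mechanism for turning fixations into clicks. The abstract does not specify which mechanism the authors use; the sketch below shows one common approach, dwell-based selection, where a click fires once the gaze stays inside a small radius for a fixed time. All names and thresholds here are illustrative assumptions, not the project's actual code; the radius is loosely sized to tolerate the reported ~1.7 cm tracking error, and the sample spacing in the usage example mirrors the reported ~17 ms per-frame PC tracking time.

```python
import math
from dataclasses import dataclass

@dataclass
class DwellClicker:
    """Fires a 'click' when gaze stays within radius_px for dwell_ms.

    Illustrative sketch only: dwell selection is one common way to operate
    a UI by gaze; the abstract does not state the authors' mechanism.
    """
    radius_px: float = 40.0   # tolerance sized for ~1.7 cm tracking error
    dwell_ms: float = 800.0   # how long a fixation must last to click
    _anchor: tuple = None     # current fixation point, if any
    _start_ms: float = 0.0    # timestamp when the fixation began

    def update(self, x, y, t_ms):
        """Feed one gaze sample; return (x, y) of a click, or None."""
        if self._anchor is None:
            self._anchor, self._start_ms = (x, y), t_ms
            return None
        ax, ay = self._anchor
        if math.hypot(x - ax, y - ay) > self.radius_px:
            # Gaze moved away: restart the dwell timer at the new point.
            self._anchor, self._start_ms = (x, y), t_ms
            return None
        if t_ms - self._start_ms >= self.dwell_ms:
            self._anchor = None  # reset so the click does not repeat
            return (ax, ay)
        return None
```

In a real pipeline, `update` would be called once per frame with the gaze tracker's screen-coordinate prediction; jittery samples within the radius still count toward the dwell, while a saccade to a new target restarts the timer.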

Awards Won:
Fourth Award of $500