ISEF | Projects Database | Finalist Abstract

A Hand-centric Gestural Interface for 3D Navigation and Interaction in Visualization

Booth Id:

Category:
Systems Software


Finalist Names:
Barish, Justin

Current methods of navigating and interacting with 3D visualizations rely on intrusive physical devices (e.g., keyboards and mice). The purpose of this project is to design and program a 3D interface that offers navigational control via hand motions and interaction through gestures (3D hand motions). The interface was first designed and built for a CAVE Virtual Reality facility, then adapted for desktop use. In the CAVE implementation, the user wears a glove and 3D glasses, each fitted with markers, so that infrared cameras can sense and relay their positions. A program was written so that when an arm is extended, the scene moves in the direction the arm indicates. Self-testing this interface against a gamepad showed its effectiveness. In the desktop interface, a Leap Motion controller provides hand positions as a fist is moved and rotated. The program computes camera translations from the displacement between the starting and current fist positions, and camera rotations from fist roll. To recognize gestures, the program normalizes each performed gesture by resampling, rotating, and rescaling its points, then compares it against stored gesture templates. Self-testing determined that the interface was effective and that the recognizer's average accuracy is 95.16% with an average recognition time of 4.6 ms.
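The desktop navigation scheme described in the abstract (camera translation proportional to fist displacement from its starting position, camera rotation from fist roll) could be sketched as follows. This is a minimal illustration, not the project's code: the class and gain constants are assumptions, and a real implementation would read the pose from the Leap Motion API each frame.

```python
# Sketch of mapping fist pose to camera motion, assuming a tracker that
# reports a fist position (x, y, z) and a roll angle every frame.
# FistCamera and the gain constants are illustrative names, not from the project.

TRANSLATE_GAIN = 0.05  # assumed: camera units moved per unit of fist displacement
ROTATE_GAIN = 1.0      # assumed: camera roll per radian of fist roll

class FistCamera:
    def __init__(self, start_pos, start_roll):
        # The pose captured when the fist first closes is the neutral reference.
        self.start_pos = start_pos
        self.start_roll = start_roll

    def update(self, pos, roll):
        # Translation is proportional to displacement from the starting position.
        offset = tuple((pos[i] - self.start_pos[i]) * TRANSLATE_GAIN
                       for i in range(3))
        # Rotation is proportional to roll relative to the starting roll.
        droll = (roll - self.start_roll) * ROTATE_GAIN
        return offset, droll
```

Anchoring motion to the fist's *starting* pose (rather than its absolute position) lets the user re-clutch: opening and re-closing the fist resets the neutral point.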
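The recognition pipeline described above (resample the gesture's points, rotate, rescale, then compare against stored templates) resembles the well-known $1 unistroke recognizer family. A minimal Python sketch of that style of template matcher follows; the point count, helper names, and template set are assumptions for illustration, not details taken from the project.

```python
import math

N = 64  # assumed number of resampled points per gesture

def resample(points, n=N):
    # Redistribute the stroke into n equidistant points along its path.
    pts = list(points)
    path_len = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    interval = path_len / (n - 1)
    out, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= interval:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:          # pad if floating point left us short
        out.append(pts[-1])
    return out[:n]

def centroid(points):
    return (sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points))

def rotate_to_zero(points):
    # Rotate so the angle from the centroid to the first point is zero.
    cx, cy = centroid(points)
    theta = math.atan2(points[0][1] - cy, points[0][0] - cx)
    c, s = math.cos(-theta), math.sin(-theta)
    return [((x - cx) * c - (y - cy) * s + cx,
             (x - cx) * s + (y - cy) * c + cy) for x, y in points]

def scale_and_translate(points):
    # Scale to a unit bounding box, then translate the centroid to the origin.
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w = (max(xs) - min(xs)) or 1e-9
    h = (max(ys) - min(ys)) or 1e-9
    pts = [(x / w, y / h) for x, y in points]
    cx, cy = centroid(pts)
    return [(x - cx, y - cy) for x, y in pts]

def normalize(points):
    return scale_and_translate(rotate_to_zero(resample(points)))

def recognize(stroke, templates):
    # templates: {name: normalized point list}; pick the closest by path distance.
    cand = normalize(stroke)
    best, best_d = None, float("inf")
    for name, tmpl in templates.items():
        d = sum(math.dist(a, b) for a, b in zip(cand, tmpl)) / N
        if d < best_d:
            best, best_d = name, d
    return best
```

Because every gesture is resampled, rotated, and rescaled into the same canonical frame before comparison, the match is largely invariant to where, how large, and at what angle the gesture was drawn, which is what makes simple per-point distance against templates workable.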

Awards Won:
Serving Society Through Science: First Award of $500
Fourth Award of $500