Booth Id:
EBED019
Category:
Embedded Systems
Year:
2018
Finalist Names:
Xu, David (School: Jericho High School)
Abstract:
Surface-free digitization of hand-drawn shapes, words, and images can be accomplished by adding inertial sensors to a pen-like device. To create such a device, an MPU6050 was attached to a Pilot Precise V5 pen, allowing orientation and acceleration data to be recorded. A connected Arduino UNO R3 transferred the motion data over a serial USB connection to a computer, where block-based thresholds (measuring average acceleration magnitude, magnitude of average acceleration, and magnitude of individual component accelerations) classified stationary periods; zero-velocity updates during those periods made it possible to filter and integrate the data into displacement. Orthogonal distance minimization was used to find a best-fit plane for the resulting pen movement path, reducing skew in the projected image by providing a reference for camera positioning. The projected image was aligned to the originally drawn image, and the Modified Hausdorff Distance, expressed as a percentage of the average image dimension, was used to compare the recreation accuracies of different filtering criteria. One effective classification criterion treated movement along any of the three component axes as an indication of overall movement. Not all recreated images were recognizable, owing to misclassification of stationary and moving periods. Accuracy improved when pauses were added between segments of the drawing, indicating that adjusting the block-based filtering parameters (criterion thresholds or block lengths) may further improve accuracy. A Bluetooth-enabled prototype of the device was also created. The developed technology could eventually enable “in-the-air” drawing, visualization of baseball swings, and tracking of surgical instruments.
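The block-based stationary classification with zero-velocity updates described in the abstract could be sketched as below. The block length, threshold, and sampling interval are illustrative assumptions, not the project's actual parameters, and the criterion shown (magnitude of each averaged component below a threshold) is only one of the several criteria the abstract lists.

```python
import numpy as np

def classify_blocks(accel, block_len=50, thresh=0.05):
    """Flag each fixed-length block of accelerometer samples (n x 3,
    gravity-compensated) as stationary when the magnitude of every
    component's block average stays below a threshold.
    block_len and thresh are assumed values for illustration."""
    n_blocks = len(accel) // block_len
    stationary = np.zeros(n_blocks, dtype=bool)
    for i in range(n_blocks):
        block = accel[i * block_len:(i + 1) * block_len]
        stationary[i] = np.all(np.abs(block.mean(axis=0)) < thresh)
    return stationary

def integrate_with_zupt(accel, stationary, block_len=50, dt=0.01):
    """Double-integrate acceleration to displacement, resetting the
    velocity to zero inside blocks classified as stationary (ZUPT),
    which suppresses the drift of naive double integration."""
    vel = np.zeros_like(accel)
    pos = np.zeros_like(accel)
    for k in range(1, len(accel)):
        block = k // block_len
        if block < len(stationary) and stationary[block]:
            vel[k] = 0.0  # zero-velocity update
        else:
            vel[k] = vel[k - 1] + accel[k] * dt
        pos[k] = pos[k - 1] + vel[k] * dt
    return pos
```

With all-zero input the classifier marks every block stationary and the integrated displacement stays at zero, while a sustained nonzero acceleration is classified as movement and integrated normally.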
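The abstract's best-fit plane from orthogonal distance minimization has a standard closed-form solution: the plane passes through the centroid of the points, and its normal is the singular vector with the smallest singular value of the centered point cloud. A minimal sketch of that computation, with an illustrative projection helper:

```python
import numpy as np

def best_fit_plane(points):
    """Fit a plane to an n x 3 point cloud by minimizing the sum of
    squared orthogonal distances: the optimal plane passes through the
    centroid, with normal given by the last right singular vector
    (smallest singular value) of the centered points."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal

def project_to_plane(points, centroid, normal):
    """Orthogonally project points onto the fitted plane by removing
    each point's signed distance along the unit normal."""
    d = (points - centroid) @ normal
    return points - np.outer(d, normal)
```

For a pen path lying in the z = 0 plane, the recovered normal is (0, 0, ±1) and projection removes only the z component.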
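The accuracy metric named in the abstract, the Modified Hausdorff Distance (Dubuisson and Jain's variant), is the larger of the two directed mean nearest-neighbor distances between point sets. A sketch of that metric, plus a hypothetical helper for the abstract's normalization by average image dimension:

```python
import numpy as np

def modified_hausdorff(a, b):
    """Modified Hausdorff Distance between two n x d point sets:
    max of the mean distance from each point of a to its nearest
    point of b, and vice versa."""
    dists = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    d_ab = dists.min(axis=1).mean()  # mean over a of nearest in b
    d_ba = dists.min(axis=0).mean()  # mean over b of nearest in a
    return max(d_ab, d_ba)

def mhd_percent(a, b, img_shape):
    """Express the MHD as a percentage of the average image dimension
    (the normalization described in the abstract); img_shape is an
    assumed (height, width) tuple."""
    avg_dim = (img_shape[0] + img_shape[1]) / 2
    return 100.0 * modified_hausdorff(a, b) / avg_dim
```

Identical point sets give a distance of zero, so lower percentages indicate a recreation closer to the originally drawn image.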