Triplett, Sam (School: McIntosh High School)
This project documents the development and measures the accuracy of a sensor glove that reads the relative angles of the primary joints in the digits of the human hand. Although motion-tracking techniques for creating animations already exist, they are impractical for consumer use because they rely on specific lighting and backgrounds. Therefore, a glove was equipped with flex sensors, one per finger and two for the thumb, attached at key locations on the hand and wired to an Arduino microcontroller, which computes the resistance of each sensor. The Arduino compiles the raw data and sends the voltage readouts over a USB COM port. From there, resistances were recorded for several standard hand positions, and logarithmic, quadratic, and power regressions were calculated relating physical joint angles to each sensor's resistance. These regressions were then implemented in a secondary program that continuously reads the raw data through a serial port and displays a ragdoll-style hand animation following the predicted movement of the operator's hand. Although this 3D model is useful for visualizing potential applications of such a tool, the nature of the glove's output also makes it easily adaptable to other uses, such as remotely controlling a robotic hand to operate on a patient or delicate system, or manipulating objects in a virtual-reality environment without breaking immersion. To determine the sensor's accuracy, the regressions relating tested joint positions to predicted angles were reviewed; on average, the quadratic, logarithmic, and power regressions accounted for 95.4%, 85.7%, and 94.4% of the variation, respectively.