ISEF | Projects Database | Finalist Abstract

Utilizing Gesture Recognition with 3D Printed Prosthetics

Booth Id:

Category:
Robotics and Intelligent Machines

Finalist Names:
Li, Qingfeng (School: Laramie High School)

The number of amputees is rising rapidly worldwide, with more than a million amputations performed annually. Unfortunately, usage rates for upper-limb prosthetics remain as low as 27% to 56% in some nations. Advanced prosthetics that use myoelectric control are often cost prohibitive, while less expensive 3D printed alternatives lack functionality. This project aims to create a prosthetic arm that is affordable and meets the functional needs of upper-limb amputees. Affordability and functionality goals were met by designing a new prosthetic that combines flex sensors, 3D printing, and accelerometer-based gesture recognition. Using accelerometer gesture recognition instead of buttons for prosthetic limb control gives the amputee the freedom to use both upper limbs independently, without relying on the functional limb to control the prosthetic. Furthermore, the prosthetic gestures and hand positions can be tailored to the amputee's individual needs. In this design, each accelerometer gesture corresponds to a unique setting, allowing amputees to use different hand positions in different scenarios. A keypad enables amputees to record custom hand positions. The hand itself is controlled with flex sensors, which serve as an inexpensive alternative to myoelectric sensors. Incorporating simple methods for hand-position customization into a 3D printed shell makes this prosthetic design practical, functional, and cost effective. Additionally, the design can be easily modified through 3D printing technology, so the prosthetic, currently best suited for an adult, can be refined to fit a child amputee.
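The control scheme described above, in which each accelerometer gesture selects a distinct stored hand position, can be sketched in code. The following Python sketch is purely illustrative: the gesture names, the 1.5 g threshold, the dominant-axis classification rule, and the servo-angle presets are all assumptions for demonstration, not the finalist's actual firmware or calibration values.

```python
# Hypothetical sketch of gesture-to-preset prosthetic control.
# All thresholds, gesture names, and angles are illustrative assumptions.

GESTURE_THRESHOLD = 1.5  # g; an acceleration spike above this counts as a deliberate gesture

# Each gesture maps to a stored hand position: one angle (degrees) per finger servo.
# A keypad routine, as in the abstract, could overwrite these entries with custom positions.
HAND_PRESETS = {
    "flick_x": [90, 90, 90, 90, 90],  # open hand
    "flick_y": [10, 10, 10, 10, 45],  # power grip
    "flick_z": [90, 10, 10, 10, 45],  # pinch grip
}

def classify_gesture(ax, ay, az):
    """Return the gesture for the dominant acceleration axis,
    or None if no axis exceeds the threshold (no deliberate gesture)."""
    readings = {"flick_x": abs(ax), "flick_y": abs(ay), "flick_z": abs(az)}
    gesture, magnitude = max(readings.items(), key=lambda kv: kv[1])
    return gesture if magnitude >= GESTURE_THRESHOLD else None

def select_preset(ax, ay, az, current_position):
    """Switch to the preset for a detected gesture; otherwise hold the current position."""
    gesture = classify_gesture(ax, ay, az)
    return HAND_PRESETS[gesture] if gesture else current_position
```

In a real device, `select_preset` would run in the firmware's sensor-polling loop, with flex-sensor input then driving the fingers toward the selected preset.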