ISEF | Projects Database | Finalist Abstract

Implementing New Methods of Prosthetic Control

Booth Id:
ROBO014

Category:
Robotics and Intelligent Machines

Year:
2017

Finalist Names:
Li, Qingfeng (School: DaVinci Academy of the Science and the Arts)

Abstract:
Close to fifty percent of upper-limb amputees do not use a prosthetic, largely because traditional prosthetic hands offer limited functionality while more advanced options are costly. Amputees would be more likely to wear prosthetics that are both effective and inexpensive. 3D printing is one technology currently reducing the price of prosthetics; however, the methods used to control these prosthetic hands remain either primitive or expensive. This project aims to solve that problem by creating a system that gives the user complete control over a prosthetic hand and is also inexpensive to implement. The proposed solution combines voice recognition and gesture recognition: voice recognition is useful because of its ease of use, while gesture recognition is invaluable because of its reliability. A model prosthetic was developed to test the accuracy of these methods and to demonstrate the degree of control the system gives a user. The main component of the prosthetic hand is a Raspberry Pi, which links several servos, a sensor, and a screen together. It can also store commands and gestures, allowing amputees to record custom commands and gain finer control over their prosthetic. The results show that the combination of voice and gesture recognition is successful: with background noise, gesture recognition is more accurate, but without it, voice commands are superior due to their simplicity.
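The control scheme the abstract describes, picking voice input in quiet conditions and gestures in noisy ones, plus user-recorded custom commands mapped to servo positions, can be sketched as follows. This is a minimal illustration, not the finalist's actual code: the noise threshold, command names, and servo-angle representation are all assumptions, and the real recognizers and servo drivers are omitted.

```python
# Illustrative sketch of the abstract's control scheme (not the project's code).
# Assumptions: a measured ambient noise level in dB, five servo angles (one per
# finger), and a hypothetical NOISE_THRESHOLD_DB cutoff for voice reliability.

NOISE_THRESHOLD_DB = 60.0  # assumed cutoff above which voice input is unreliable

# Built-in grips; angles are per-finger servo positions in degrees.
BUILTIN_COMMANDS = {
    "open": [0, 0, 0, 0, 0],
    "fist": [180, 180, 180, 180, 180],
}


class ProstheticController:
    def __init__(self):
        # Start with built-in grips; custom ones are added at runtime,
        # mirroring the abstract's stored user-recorded commands.
        self.commands = dict(BUILTIN_COMMANDS)

    def record_custom_command(self, name, servo_angles):
        """Store a user-defined grip, e.g. a pinch for holding a pen."""
        self.commands[name] = list(servo_angles)

    def select_input_mode(self, noise_db):
        """Prefer voice when quiet (simpler), gestures when noisy (more reliable)."""
        return "voice" if noise_db < NOISE_THRESHOLD_DB else "gesture"

    def resolve(self, command_name):
        """Map a recognized command name to servo angles (None if unknown)."""
        return self.commands.get(command_name)
```

In use, a loop on the Raspberry Pi would sample the microphone's noise level, pick the input mode with `select_input_mode`, run the corresponding recognizer, and feed the resolved angles to the servo driver:

```python
ctrl = ProstheticController()
ctrl.record_custom_command("pinch", [120, 120, 0, 0, 0])
ctrl.select_input_mode(40.0)   # quiet room -> "voice"
ctrl.select_input_mode(75.0)   # noisy room -> "gesture"
ctrl.resolve("pinch")          # -> [120, 120, 0, 0, 0]
```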