Artificial limbs have the potential to help people who have lost a leg or an arm live and work normally. Unfortunately, this technology has not yet reached the goal of humanlike performance. The main challenge is controlling the limb directly by thought, the way natural limbs are controlled. Researchers can already record brain activity and detect patterns in it for such applications, but this approach scales poorly to large quantities of data and performs poorly in practice. Machine learning, on the other hand, has grown rapidly; deep learning in particular has proven its capabilities on big data across a wide range of applications. The aim of this research is to apply deep learning to electroencephalogram (EEG) readings recorded while muscle activity is sensed, creating a machine-learning algorithm that learns the brain activity associated with intended movements. The trained neural network (NN) could later be used with a portable EEG headset to control one or more artificial limbs with close-to-natural performance. To test my hypothesis, I created a simple prototype machine-learning algorithm that detects simple hand movements (right and left) in prerecorded EEG readings. I used a dataset of 560 11-channel readings, giving 6,160 signals to analyze. The deep-learning-based algorithm performed well. This experiment supports my idea, which could help many people worldwide, including the 20 million Americans who have lost a limb. It could also be used for medical analysis and for brain-computer interfaces that control smart devices.
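The classification step described above can be illustrated with a minimal sketch. This is not the project's actual code: it is a hypothetical NumPy-only feed-forward network trained on synthetic "EEG-like" 11-channel windows, with the sizes, channel count, and injected class pattern all assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes for illustration; the real dataset had 560 readings.
N, CHANNELS, SAMPLES = 200, 11, 32
X = rng.normal(size=(N, CHANNELS * SAMPLES))       # flattened EEG windows
y = (rng.random(N) < 0.5).astype(int)              # 0 = left, 1 = right (synthetic labels)
# Inject a separable pattern so the toy network has something to learn.
X[y == 1, :CHANNELS] += 1.5

# One hidden layer, tanh activation, sigmoid output.
W1 = rng.normal(scale=0.1, size=(CHANNELS * SAMPLES, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 1))
b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return h, p.ravel()

lr = 0.1
for _ in range(300):
    h, p = forward(X)
    grad = (p - y) / N                  # dLoss/dlogit for binary cross-entropy
    dW2 = h.T @ grad[:, None]
    db2 = grad.sum()
    dh = grad[:, None] @ W2.T * (1 - h**2)
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, p = forward(X)
acc = float(((p > 0.5) == y).mean())
print(f"training accuracy: {acc:.2f}")
```

On real EEG data the same idea would typically use band-pass filtering and a deeper network, but the training loop (forward pass, cross-entropy gradient, weight update) follows the same shape.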
GoDaddy: $1,500 Data Award