A research team at the University of Houston has developed an interface that allows a user to control a prosthetic limb using thought alone.
The technology uses noninvasive brain monitoring to capture activity in the brain regions that control movement and grasping. The researchers built a brain-machine interface that decoded the subject's intentions and allowed him to grasp objects with the prosthetic limb successfully 80 percent of the time, according to a news release.
The noninvasive method showed success rates similar to those of previous experiments using implanted electrodes or myoelectric control, which relies on electrical signals in the arm's muscles. The new method avoids the risks of surgical implantation and is available to more patients because it does not depend on residual electrical activity in the limb's muscles, according to the news release.
Brain-machine interfaces have the potential to improve prosthetics and offer insight into how the brain governs fine motor control, according to the news release.
The research, published in the journal Frontiers in Neuroscience, describes an algorithm trained on the subject's electroencephalographic (EEG) data as well as on information about prosthetic hand movements gathered from able-bodied volunteers, according to the news release.
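To give a rough sense of what decoding movement intent from brain signals involves, the sketch below trains a simple linear decoder on synthetic "EEG" features. Everything here is illustrative: the data is randomly generated, and the feature choices and least-squares classifier are stand-ins, not the algorithm the University of Houston team actually used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for EEG features: one band-power value per channel
# per trial. Real decoders use carefully filtered multichannel recordings.
n_trials, n_channels = 200, 8
grasp = rng.normal(1.0, 0.5, (n_trials // 2, n_channels))  # "grasp" trials
rest = rng.normal(0.0, 0.5, (n_trials // 2, n_channels))   # "rest" trials
X = np.vstack([grasp, rest])
y = np.hstack([np.ones(n_trials // 2), np.zeros(n_trials // 2)])

# Shuffle and split into training and test trials.
idx = rng.permutation(n_trials)
train, test = idx[:150], idx[150:]

# Fit a least-squares linear decoder (with a bias column) that maps
# channel features to a grasp/rest intent score.
Xb = np.hstack([X, np.ones((n_trials, 1))])
w, *_ = np.linalg.lstsq(Xb[train], y[train], rcond=None)

# Threshold the score to predict intent, then measure accuracy.
pred = (Xb[test] @ w > 0.5).astype(float)
accuracy = (pred == y[test]).mean()
print(f"decoder accuracy: {accuracy:.2f}")
```

On this idealized synthetic data the decoder separates the two classes easily; real EEG is far noisier, which is what makes the reported 80 percent grasp success rate notable for a noninvasive system.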