ScienceDaily (Nov. 9, 2007) — Neuroscientists have significantly advanced brain-machine interface (BMI) technology to the point where severely handicapped people who cannot contract even one leg or arm muscle can now independently compose and send e-mails and operate a TV in their homes, using only their thoughts to execute these actions.
Thanks to the rapid pace of research on the BMI, one day these and other individuals may be able to feed themselves with a robotic arm and hand that moves according to their mental commands.
“Our work has shown how important the learning process is when using brain-controlled devices,” says Andrew Schwartz, PhD, of the University of Pittsburgh. “By permitting the subject to adaptively recode the generated neural activity, the overall performance of the device is dramatically increased.
“Furthermore, as we have progressed in this work, it has become apparent that the basic idea of ‘intention’ during learning is very important and can be addressed by the direct observation of the neuronal transformations taking place during this fundamental processing,” Schwartz says.
Among the institutions conducting cutting-edge BMI research is the University of Pittsburgh, where scientists recently succeeded in developing technology that allows a rhesus macaque monkey to mentally control a robotic arm to feed itself pieces of fruit. The robotic arm’s fast and smooth movements were triggered by electrical signals generated in the monkey’s brain when the animal thought about an action.
In previous studies, this lab developed the technology to tap a macaque monkey’s motor cortical neural activity, making it possible for the animal to use its thoughts to control a robotic arm to reach for food targets presented in 3D space.
In the Pittsburgh lab’s latest studies, macaque monkeys not only mentally guided a robotic arm to pieces of food but also opened and closed the robotic arm’s hand, or gripper, to retrieve them. Just by thinking about picking up and bringing the fruit to its mouth, the animal fed itself.
The monkey’s own arm and hand did not move while it manipulated the two-finger gripper at the end of the robotic arm. The animal used its own sight for feedback about the accuracy of the robotic arm’s actions as it mentally moved the gripper to within one-half centimeter of a piece of fruit.
“The monkey developed a great deal of skill using this physical device,” says Meel Velliste, PhD. “We are in the process of extending this type of control to a more sophisticated wrist and hand for the performance of dexterous tasks.”
Velliste and the other members of the Pittsburgh research team point out that imparting skill and dexterity to these devices will help amputees and paralyzed patients to perform everyday tasks.
The animal’s thoughts generated electrical signals that were recorded by tiny electrodes the scientists had implanted in the monkey’s motor cortex. A computer decoding algorithm translated the signals into movements of the robotic arm and gripper.
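The article does not specify which decoding algorithm the Pittsburgh group used, but a population-vector-style decoder is one common way to translate motor cortical firing rates into movement commands. The following Python sketch is purely illustrative; the neuron count, baseline rates, and preferred directions are assumed values, not the lab's actual parameters.

# Minimal population-vector-style decoder sketch (illustrative only; the
# article does not specify the Pittsburgh lab's actual algorithm).
import numpy as np

def decode_velocity(firing_rates, baseline_rates, preferred_dirs, gain=1.0):
    """Map a vector of firing rates (one per neuron) to a 3D velocity command.

    firing_rates   : (n_neurons,) spike counts in the current time bin
    baseline_rates : (n_neurons,) each neuron's mean rate at rest
    preferred_dirs : (n_neurons, 3) unit vectors, each neuron's preferred direction
    """
    modulation = firing_rates - baseline_rates           # rate change around baseline
    weights = modulation / (np.abs(modulation).sum() + 1e-9)
    velocity = gain * weights @ preferred_dirs            # weighted sum of preferred directions
    return velocity

# Hypothetical example: 96 recorded neurons with random preferred directions.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(96, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
v = decode_velocity(rng.poisson(12, 96).astype(float), np.full(96, 10.0), dirs)
print(v)  # 3D velocity command that would be sent to the robotic arm controller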
In another study, a Washington University School of Medicine research team has generated new information about a long-held theory regarding the separate functions and responsibilities of the left brain and the right brain. In the process, the researchers, led by Eric Leuthardt, PhD, and his graduate students Kimberly Wisneski and Nick Anderson, have applied their findings to a new neuroprosthetic strategy to improve the rehabilitation of stroke and trauma victims who have suffered damage to either the right or left half of the brain.
“Classic understanding of brain function has asserted that one hemisphere, or one side of the brain, controls arm and leg movement on the opposite side of the body,” Wisneski explains.
The team’s new findings indicated that if the left hemisphere were damaged, the right side of the brain still had electrical signals that could be used to trigger right-sided arm and leg movement.
The scientists recorded the brain activity of six epilepsy patients in whom electrodes had been placed over the surface of the brain for reasons unconnected to the purpose of the study. (The intracranial electrode arrays were implanted on the surface of each patient’s brain to locate the brain areas involved in the patient’s seizures.) “This access provided us with insights that could not be obtained using other methods,” Leuthardt says.
The team recorded electrocorticographic (ECoG) signals while each patient opened and closed his or her hands. These recordings revealed brain activity in the hemisphere on the same side as the moving hand. These same-side signals occurred at lower frequencies than the signals emitted in the hemisphere opposite the moving hand.
In addition, these same-side signals arose in spatially distinct areas of the brain and earlier in time than the signals recorded for opposite-side hand movement.
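One simple way to compare such signals is to estimate spectral power in different frequency bands of the recorded ECoG. The sketch below is illustrative only; the sampling rate and band limits are assumptions, not the bands reported in the study.

# Illustrative sketch: estimating band power in one ECoG channel to compare
# lower-frequency (same-side) and higher-frequency (opposite-side) movement-
# related activity. Sampling rate and band limits are assumed values.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, fmin, fmax):
    """Average power of `signal` (1D array) between fmin and fmax Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)  # 1-second windows
    mask = (freqs >= fmin) & (freqs <= fmax)
    return psd[mask].mean()

fs = 1000                               # hypothetical 1 kHz sampling rate
ecog = np.random.randn(10 * fs)         # placeholder for one recorded channel
low_power  = band_power(ecog, fs, 8, 32)     # lower-frequency band
high_power = band_power(ecog, fs, 76, 100)   # higher-frequency (gamma) band
print(low_power, high_power)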
“This evidence demonstrates that the brain encodes information regarding planning for movements of the same-sided limb and that this information is encoded in a way that is unique from that corresponding to opposite-side limb movements,” Wisneski says.
The team next determined how these results could be used to improve the rehabilitation of stroke and brain injury patients. Their focus: the brain-computer interface (BCI), an external device originally designed to benefit patients with spinal cord injury and other disorders that do not affect the brain. The BCI enables individuals to control, with their thoughts alone, a cursor on a computer screen, a wheelchair, or a robotic arm.
To benefit stroke and brain injury patients, the BCI would have to be adapted to respond to signals from only one side of the brain.
“To allow these patients to benefit from the use of a brain-computer interface, signals for control for two sides of the body must be acquired from the single functioning hemisphere alone,” Leuthardt says. “In this paradigm, one side of the body — the side opposite to the unaffected half of the brain — would be controlled through normal physiologic pathways, and the other side of the body — the side affected by the stroke and on the same side as the unaffected hemisphere — would be controlled through neuroprosthetic assistance using same-side signals from the undamaged hemisphere.”
Other scientists are studying the phenomenon in which neurons in the brain’s motor cortex are active not only when an individual bends a leg but also when he or she observes other people moving their legs. This neural mechanism may help explain the development of innate skills such as speech and new motor skills such as a golf swing.
Graduate student Dennis Tkach and colleagues at the University of Chicago hope to tap this neural mechanism to modify BMI systems for use by people who are paralyzed from spinal cord injury or related trauma. Currently, the BMI’s functioning depends on mathematical maps that connect brain cell activity to the action — arm or leg movement, for example — that the system is designed to replace.
Tkach says that the phenomenon of congruent neural activity may provide the mathematical maps of these paralyzed patients. “The existence of these neurons offers the means of creating this mapping by relating neural activity of the patient to an action observed by that patient,” he says. “The neural activity is congruent because the way that the neurons fire during observation of familiar action is the same as the way they fire when the individual is performing that same action.”
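In principle, such a map could be built by regressing neural activity recorded during observation onto the kinematics of the observed movement. The following sketch shows one simple least-squares version of that idea; the array shapes and data are placeholders, not the Chicago group's actual method.

# Sketch of building a decoder "map" from observation data alone: fit a linear
# regression from firing rates recorded while the patient watches a movement
# to the kinematics of that observed movement. Data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
rates = rng.poisson(8, size=(500, 64)).astype(float)   # (timebins, neurons) during observation
observed_velocity = rng.normal(size=(500, 2))          # (timebins, 2) velocity of watched cursor

# Least-squares map W such that rates @ W approximates the observed velocity.
W, *_ = np.linalg.lstsq(rates, observed_velocity, rcond=None)

# Later, the same map could be applied to new neural activity to drive a BMI.
predicted = rates @ W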
The University of Chicago study, which was conducted with rhesus monkeys, was the first to analyze a neural system that showed congruent activity with movement on a single cell level in the primary motor cortex.
The monkeys were trained to perform a video task in a two-dimensional, horizontal workspace located in front of them. They guided a circular cursor to a square target. Both the cursor and the target were projected onto the workspace. The animals controlled the cursor by moving an exoskeletal robot arm in which their active arm rested.
They were then trained to relax and watch a playback of the task they had just performed. During the playback, the monkeys saw the target, the cursor, or both on the screen.
“We varied visibility of the video task components in an attempt to gain a better understanding of what facilitates the neural congruency between observation and action,” Tkach says. “The study showed that the presence of the goal of an action bears a greater impact on the strength of this congruence, while the observation of the motion to this goal carries minimal importance.”
This result emphasized the importance of the goal as the facilitator of this action-like neural response, Tkach says.
The brain cell activity patterns were recorded from arrays of 100 electrodes surgically implanted in the monkeys’ motor cortical areas. Because of these arrays, Tkach was able to obtain simultaneous neural activity data from a population of single cells along with a more global neural signal. Analyzing the data, he noted that the activity patterns of the neurons during the observation period correlated highly with the cells’ activity patterns when the animal was using its right arm to guide the cursor.
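A straightforward way to quantify that kind of congruence is to correlate each neuron's firing pattern during observation with its pattern during execution, as in the illustrative sketch below; the data here are synthetic placeholders, not the recorded activity.

# Correlate each neuron's observation-period firing pattern with its
# execution-period pattern. High per-neuron correlations indicate congruence.
import numpy as np

rng = np.random.default_rng(2)
exec_rates = rng.poisson(10, size=(100, 200)).astype(float)    # neurons x timebins, execution
obs_rates  = exec_rates + rng.normal(0, 2, exec_rates.shape)   # observation (similar by construction)

congruence = np.array([
    np.corrcoef(exec_rates[i], obs_rates[i])[0, 1]   # Pearson r per neuron
    for i in range(exec_rates.shape[0])
])
print(congruence.mean())  # high mean correlation = congruent activity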
“Our results lead us to believe that when presented with the observation of a familiar action the monkeys inadvertently generate a motor command that is very similar to one that would occur if the animal were to execute the behavior,” Tkach says. The congruence of this motor command with the “actual” one was not an all-or-nothing phenomenon but instead spanned a continuum contingent upon which components of the observed action were present.
In other work, Wadsworth Center scientists in Albany, N.Y., have succeeded in developing a BCI that provided people who are severely disabled with the ability to use their personal computers. For example, they were able to word-process, send e-mail messages, and remotely turn on or off the lights or TV in their homes. In the future, even more environmental control options will be available, says Eric Sellers, PhD.
The Wadsworth Center BCI system enabled a scientist with advanced amyotrophic lateral sclerosis (ALS) to communicate by e-mail with his research team. “It has allowed him to continue to direct a highly successful NIH-funded medical research program,” Sellers says. “The initial results indicate that the BCI can function without close technical oversight and can improve communication ability and quality of life. This initial success suggests that a home BCI system can be of practical value for people with severe motor disabilities and that caregivers without special expertise can learn to support it.”
Five severely disabled people have participated in the Wadsworth research program that evaluates the center’s BCI system. The first participant, the 49-year-old scientist with ALS, has been unable to move any muscles in his body except for his eyes. For five to seven hours every day since February 2006, he has worn a simple electrode cap on his scalp that picks up the electrical activity generated by his brain. The cap records electroencephalographic (EEG, or brain wave) activity at eight scalp locations.
The user’s brain waves were translated into simulated keystrokes. Software developed at Wadsworth presented rows and columns of a 72-element, 8 x 9 matrix that flashed in random order while the user paid attention to the element that he or she wanted to select. The software recognized that element and executed the appropriate keystroke. With this design, the patient could use the entire keyboard.
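The article does not detail the selection algorithm, but one common scheme for such flashing-matrix spellers is to score the brain response to each flash and accumulate the scores per row and per column. The sketch below illustrates that scheme; the scoring input stands in for a trained EEG classifier and is not the Wadsworth software itself.

# Minimal sketch of row/column speller selection: accumulate classifier scores
# for each flashed row and column, then pick the best-scoring intersection.
import numpy as np

N_ROWS, N_COLS = 8, 9          # 72-element matrix, as described in the article

def select_element(flash_order, response_scores):
    """flash_order: list of ('row'|'col', index), one entry per flash.
    response_scores: classifier score for the EEG epoch following each flash."""
    row_scores = np.zeros(N_ROWS)
    col_scores = np.zeros(N_COLS)
    for (kind, idx), score in zip(flash_order, response_scores):
        if kind == 'row':
            row_scores[idx] += score
        else:
            col_scores[idx] += score
    return int(row_scores.argmax()), int(col_scores.argmax())  # (row, col) of chosen key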
Sellers says that caregivers and family members learned to place the electrode cap on patients’ scalps, enable the software, and generally maintain the system, which the researchers monitored remotely via data transferred weekly from patients’ homes to the lab. To date, a total of five people with ALS have used the Wadsworth system in their homes.
In addition, the Wadsworth Center team has tested protocols in the laboratory that extend BCI functionality to benefit people with limited eye mobility, poor visual acuity, or difficulty maintaining gaze, impairments that can occur with severe motor disorders such as ALS, brainstem stroke, or cerebral palsy. For these individuals, the scientists have been developing a BCI system that uses auditory rather than visual stimuli.
In the auditory BCI system, the rows and columns of a 6 x 6 matrix of 36 letters and numbers are represented by six environmental sounds. For each selection, the user paid attention to the sound representing the column or row containing the desired choice. Thus far, most of the people who tested this auditory system in the lab used it with accuracy sufficient to support effective BCI operation.
The researchers also have been developing a BCI system that uses sensorimotor rhythms (SMRs), oscillations in the EEG recorded from the scalp over the sensorimotor cortex. The SMRs provided simple communication capabilities, and users learned to employ them to control a computer cursor in one or two dimensions.
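In systems of this kind, cursor velocity is typically derived from the power of the sensorimotor rhythm relative to a calibrated baseline. The sketch below shows a one-dimensional version of that idea; the frequency band, gain, and baseline are assumptions for illustration, not Wadsworth's actual parameters.

# One-dimensional SMR cursor control sketch: cursor velocity is a linear
# function of band power over sensorimotor cortex relative to a baseline.
import numpy as np
from scipy.signal import welch

def smr_cursor_velocity(eeg_window, fs, baseline_power, gain=0.5, band=(8, 12)):
    """eeg_window: 1D EEG samples from an electrode over sensorimotor cortex."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=len(eeg_window))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    power = psd[mask].mean()
    return gain * (power - baseline_power)   # sign of the result sets cursor direction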
Adapted from materials provided by Society for Neuroscience.