08 November 2013

Virtual brain-computer interface: a two-handed game

The brain-computer interface has been taught to handle two hands at once

Kirill Stasevich, Compulenta

Millions of people are forced to live with paralyzed limbs, and so far the only thing that eases their lot is a wheelchair of one degree of sophistication or another. But when neuroscientists created brain-computer interfaces in the early 2000s, paralyzed people gained hope that sensation and mobility could return to them, at least in the form of artificial limbs.

A brain-computer interface is a device that deciphers the brain's neural signals relating to some part of the body, say an arm or a leg. Paralysis very often follows a spinal cord injury, when the nerve fibers can no longer carry a signal from the central nervous system to the working muscle. Over time, neuroscientists realized that it is not necessary to try to heal the spinal injury itself (which is not always possible anyway): instead, the signal can be picked up directly from the brain by electronic chips. To do that, however, you need to know which neurons control the limbs and what exactly their signals command. And once you have learned all this, you can teach the brain to feel an artificial leg or arm as its own.

Intensive research is under way in this area, and one of its leaders is the group of Miguel Nicolelis at Duke University Medical School (USA). Nicolelis is a recognized authority on decoding neural signals and building brain-computer devices; not long ago, for example, we reported on an experiment in which he and his group achieved brain-to-brain radio communication, so far only between rats and not with one hundred percent efficiency. (Six months later a similar experiment was repeated in humans. – VM.)

Miguel Nicolelis and his colleagues had already managed to build a device that let a monkey control the arm of a computer avatar, so to speak, straight from the brain: the device read out the neurons' signals and passed them to a virtual limb, which moved accordingly. And since the word "avatar" has come up, it is hard not to notice how much this resembles the film of the same name, with the one difference that the avatar in these experiments is still virtual, and the experiments themselves are performed on monkeys.

Having learned to decode the signals for one arm, the researchers set about developing a brain-computer interface for two limbs at once. The task turned out to be far more difficult. In everyday life we use both arms (and legs) without thinking, yet we do not realize how much fine work the brain performs to coordinate the limbs' movements. A brain-computer device must somehow reproduce this "functionality".

The researchers recorded the activity of about 500 nerve cells in several areas of the frontal and parietal lobes of the cortex in both hemispheres of macaques. A special algorithm then matched the neurons' activity to particular arm movements: to this end, the decoding program was trained on rhesus monkeys that either steered the virtual arms with joysticks or simply watched the avatar's movements passively.
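To give a feel for what "matching neural activity to movements" means in practice, here is a minimal sketch of a linear (ridge-regression) decoder that maps binned firing rates of a neural population to the velocities of two virtual arms. This is an illustration only: the data below are synthetic, and the actual study used a more sophisticated decoder (an unscented Kalman filter), not this simplified scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the training data: binned firing rates of
# ~500 neurons (rows = time bins) and the simultaneously recorded
# kinematics of BOTH virtual arms (x/y velocity for each arm = 4 values).
n_bins, n_neurons = 2000, 500
true_W = rng.normal(size=(n_neurons, 4)) * 0.05   # hidden "ground truth" mapping
rates = rng.poisson(5.0, size=(n_bins, n_neurons)).astype(float)
kinematics = rates @ true_W + rng.normal(scale=0.5, size=(n_bins, 4))

# Fit a ridge-regression decoder: W = (X^T X + lam*I)^-1 X^T Y
lam = 10.0
X = rates - rates.mean(axis=0)            # center firing rates
Y = kinematics - kinematics.mean(axis=0)  # center kinematics
W = np.linalg.solve(X.T @ X + lam * np.eye(n_neurons), X.T @ Y)

# Decode neural activity back into bimanual arm velocities
pred = X @ W
r2 = 1.0 - ((Y - pred) ** 2).sum() / (Y ** 2).sum()
print(f"decoder fit R^2 = {r2:.2f}")
```

The key point the article goes on to make is visible even in this toy setup: a single decoder trained on simultaneous two-arm data captures the joint activity pattern, whereas gluing together two separately trained one-arm decoders would miss it.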

In the journal Science Translational Medicine (Ifft et al., A Brain-Machine Interface Enables Bimanual Arm Movements in Monkeys), the scientists note that in most neurons the activity pattern changed when both arms acted simultaneously. In other words, the coordinated work of the two limbs is governed by different signals than the work of each arm on its own.

Over time the macaques learned to move the virtual arms and perform specific actions with them, for instance touching objects with both arms simultaneously. That is, the interface really did make coordinated work possible by the "power of thought" alone. At the same time, the scientists report, certain changes took place in the monkeys' brains, indicating that the virtual limbs were becoming ever more embedded in the brain's representation of its own body.

When, for comparison, two "one-handed" interfaces were connected to the macaques' brains, coordination was poor and noticeably inferior to the integrated "two-handed" device. This once again confirms that the joint work of the limbs is not simply the sum of the activity of the right-arm and left-arm neurons but a qualitatively different mode of operation, and nothing can be achieved here by merely adding neurons together.

True, this is not yet an actual neuroprosthesis, only a program that could serve as a translator between the brain and an artificial arm. Still, I would say the lion's share of the work in this direction has already been done: here it is far harder to understand what the brain wants than to explain it to a machine afterwards.

Prepared based on the materials of Duke University Medical School: Monkeys Use Minds to Move Two Virtual Arms.

Portal "Eternal Youth" http://vechnayamolodost.ru 08.11.2013
