Artificial Intelligence Has Learned to "Read Minds" in Whole Phrases

Scientists have created a system that decodes brain signals and turns them into speech. In this way, it is possible to tell what a subject is saying without hearing their words. Over time, such devices could become speech synthesizers for people who have lost the ability to speak.

The achievement is described in a scientific article published in the journal Nature Neuroscience.

For several years now, various research groups have been building systems that turn patterns of brain activity into words, in effect replacing the human speech apparatus. However, previous work usually linked brain signals to the movements of the speech organs or to the spoken sounds themselves.

Researchers at the University of California, San Francisco decided to take a shortcut: their system maps neural signals directly to whole phrases. To do this, they adapted algorithms used for machine translation from one language to another, which solve a similar problem. A brain perceiving speech translates it into the language of its own electrical activity; the task of the artificial intelligence is to perform the reverse translation.

The researchers adapted the approach used for machine translation to decode neural signals. Translation: Vesti.Nauka.

The experiment involved four women with epilepsy. Doctors had previously implanted about 250 electrodes into their brains for medical purposes, and these electrodes now served the benefit of science as well.

All the subjects read a set of phrases aloud. Meanwhile, the electrodes recorded the brain activity associated with lip movements, the pronunciation of vowels and consonants, and other parameters.

Then three artificial neural networks came into play. The first picked out patterns in how the signal from the electrodes changed over time. The second transformed the output of the first into special mathematical structures (templates). Finally, the third turned those templates into English text.
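To give a rough sense of how such a three-stage pipeline fits together, here is a minimal, hypothetical sketch in PyTorch. It is not the authors' implementation: the layer types, sizes, and vocabulary are assumptions chosen only to mirror the description above (temporal pattern extraction, an encoder that builds intermediate representations, and a decoder that emits English words).

```python
# Hypothetical sketch of a brain-to-text decoder, not the study's actual code.
import torch
import torch.nn as nn

class BrainToTextDecoder(nn.Module):
    def __init__(self, n_electrodes=250, n_features=128, hidden=256, vocab_size=250):
        super().__init__()
        # Stage 1: find patterns in how the electrode signal changes over time.
        self.temporal = nn.Conv1d(n_electrodes, n_features,
                                  kernel_size=11, stride=4, padding=5)
        # Stage 2: compress the features into intermediate representations ("templates").
        self.encoder = nn.GRU(n_features, hidden, batch_first=True)
        # Stage 3: unfold the representations into a sequence of word tokens.
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.to_vocab = nn.Linear(hidden, vocab_size)

    def forward(self, signals, max_words=12):
        # signals: (batch, n_electrodes, time_steps) of recorded voltages
        feats = self.temporal(signals).transpose(1, 2)   # (batch, time', n_features)
        _, state = self.encoder(feats)                   # final state summarizes the phrase
        # Feed the phrase summary to the decoder once per output word position.
        dec_in = state.transpose(0, 1).repeat(1, max_words, 1)
        out, _ = self.decoder(dec_in)
        return self.to_vocab(out)                        # word logits per output position

model = BrainToTextDecoder()
fake_recording = torch.randn(1, 250, 800)    # one phrase, 250 electrodes, 800 time samples
word_logits = model(fake_recording)
print(word_logits.shape)                     # torch.Size([1, 12, 250])
```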

The participants in the experiment repeated each of the 30-50 sentences twice. The nerve cells' response to the first utterance of a phrase was used to train the system, and the response to the second to test its skill.

As a result, the system was wrong only about 3% of the time, which is a very good result for this kind of technology.
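For context, the error rate of such decoders is usually counted at the word level: how many words must be inserted, deleted, or substituted to turn the decoded phrase into the one actually spoken, divided by the length of the spoken phrase. The snippet below is a self-contained toy illustration of that metric; the example sentences are invented and are not taken from the study.

```python
# Toy illustration of word-level error rate: word edit distance between the
# decoded and spoken phrases, divided by the number of spoken words.
def word_error_rate(spoken: str, decoded: str) -> float:
    ref, hyp = spoken.split(), decoded.split()
    # Classic dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

spoken  = "the lady wore a warm fleecy woolen overcoat"
decoded = "the lady wore a warm woolen overcoat"
print(f"word error rate: {word_error_rate(spoken, decoded):.1%}")  # one dropped word out of eight
```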

True, the vocabulary of the phrases used was limited to 250 words. But, first of all, even that is a great deal for people who today cannot speak or write anything at all because of paralysis. And secondly, the specialists are continuing their work.

Author: Anatoly Glyantsev
