All of our senses arise in the brain. Whatever the type of incoming information, be it the sound of music, a smell, or a visual image, it is in essence just a stream of signals transmitted and decoded by specialized cells. Apart from these signals, the brain has no direct contact with the external environment at all. And if so, it is likely that we can create new channels of interaction between the brain and the outside world and feed it data directly.
Let's go back a couple of sentences. If all information is just a stream of incoming impulses, why is vision so different from smell or taste? Why do you never confuse the sight of a blossoming pine tree with the flavor of feta cheese, or the feel of sandpaper on your fingertips with the smell of fresh espresso? One might assume this has something to do with the structure of the brain: the regions involved in hearing differ from those that process visual images, and so on. But why, in that case, do numerous studies show that in people who have lost their sight, for example, the visual areas are "reassigned" to enhance the other senses?
Thus, a hypothesis arose: internal subjective experience is determined by the structure of the data itself. In other words, information coming from, say, the retina is structured differently from data coming from the eardrum or from receptors in the fingertips, and the result is different sensations. It follows that, in theory, we can create new channels for delivering information. They would not resemble sight, hearing, taste, touch, or smell. They would be something completely new.
There are two ways to do this. The first is to implant electrodes directly into the brain. The second is to deliver signals to the brain non-invasively, for example, using wearable devices. Imagine wearing a bracelet with multiple vibration motors that stimulate different spots around your wrist to convey a stream of data. Once a consistent mapping between the information and the touch patterns is established, people can learn to recognize it. NeoSensory is working on something similar at the moment, building vibrotactile neural interfaces, and the company plans to present one of these devices in 2019.
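The key idea above is a fixed, learnable mapping from data to touch. As a minimal sketch (not the NeoSensory API; all names and the 8-motor layout here are illustrative assumptions), one could map each byte of a data stream to a vibration frame, with one motor per bit:

```python
N_MOTORS = 8  # hypothetical number of vibration motors around the wrist

def encode_byte(value: int) -> list[float]:
    """Map one byte (0-255) to per-motor vibration intensities (0.0 or 1.0)."""
    if not 0 <= value <= 255:
        raise ValueError("expected a single byte")
    # Motor i vibrates when bit i of the byte is set, giving a fixed,
    # consistent data-to-touch relationship the wearer can learn over time.
    return [float((value >> i) & 1) for i in range(N_MOTORS)]

def encode_stream(data: bytes) -> list[list[float]]:
    """Turn a data stream into a sequence of vibration frames, one per byte."""
    return [encode_byte(b) for b in data]

frames = encode_stream(b"Hi")
```

A real device would likely use graded intensities and richer codes (for instance, compressing an audio spectrum across the motors), but the principle is the same: any consistent mapping can, in time, be read by the brain as a new sense.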
Based on a note by David Eagleman, professor of psychiatry and behavioral sciences at Stanford University, author of The Brain: The Story of You, and co-founder of NeoSensory. Published by Wired.
Vladimir Kuznetsov