HAL 9000 Will Never Appear: Emotions Are Not Programmable - Alternative View


HAL 9000 is one of the most famous artificial intelligences in cinema. This supremely intelligent computer malfunctions on its way to Jupiter in Stanley Kubrick's iconic 2001: A Space Odyssey, which recently marked its 50th anniversary. HAL can speak, understand people, read their facial expressions and lips, and play chess. Its superior computational ability is complemented by uniquely human traits: it can interpret emotional behavior, reason, and appreciate art.

By endowing HAL with emotion, writer Arthur C. Clarke and filmmaker Stanley Kubrick made it one of the most human-like portrayals of intelligent technology. In one of the most memorable scenes in science fiction cinema, HAL says it is "afraid" when mission commander David Bowman begins shutting down its memory modules after a series of murderous events.

HAL is programmed to assist the crew of the spaceship Discovery, controlling the vessel with its powerful artificial intelligence. But it soon becomes clear that HAL is also emotional: it can feel fear and, to some degree, sympathy. Science fiction is science fiction, though; such emotional artificial intelligence is simply impossible in our reality at the moment. Any depth of emotion or feeling you find in today's technology is entirely fake.

“Perfect” artificial intelligence

When Bowman begins manually disabling HAL's functions in the film, HAL asks him to stop, and as we watch the startling destruction of HAL's "mental" abilities, the AI tries to calm itself by singing "Daisy Bell" - reportedly the first song ever sung by a computer.

Essentially, viewers come to feel that Bowman is killing HAL. The shutdown looks like revenge, especially after what we have learned from the film's earlier events. But while HAL is capable of making emotional judgments, real-world AI will always be limited to reasoning and decision-making. Moreover, despite what futurists claim, we will never be able to program emotions the way HAL's creators imagined, because we do not understand them. Psychologists and neuroscientists are actively trying to work out how emotions interact with cognition, but they have not yet succeeded.

In one study involving Chinese-English bilinguals, scientists examined how the emotional meaning of words can alter unconscious mental processes. When participants were shown positive or neutral words such as "holiday" or "tree," they unconsciously retrieved the Chinese forms of those words. But when the words had negative connotations, such as "murder" or "rape," their brains blocked access to their native language - without their awareness.


Reasoning and emotion

On the other hand, we understand reasonably well how reasoning works. We can describe how we arrive at rational decisions, write down rules, and turn those rules into processes and code. But emotion remains a mysterious evolutionary legacy. Its sources cannot be traced - they are too vast - and it is not simply an attribute of the mind that can be added deliberately. To program something, you need to know not only how it works but why. Reasoning has goals and objectives; emotion does not.
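To make the contrast concrete, here is a minimal sketch of what "turning rules into code" looks like. Everything in it - the function name, the rules, and the thresholds - is invented purely for illustration and is not drawn from the film or from any real system; the point is only that such rules can be stated explicitly, whereas no comparable specification exists for "fear."

```python
# A minimal, hypothetical sketch: explicit if-then rules are easy to
# express as code, because we can state both how the decision is made
# and what it is for. All rules and thresholds below are invented.

def autopilot_action(oxygen_level: float, hull_intact: bool) -> str:
    """Return a ship action chosen by explicit, inspectable rules."""
    if not hull_intact:
        return "seal affected compartments"
    if oxygen_level < 0.19:
        return "increase life-support output"
    return "maintain course"

print(autopilot_action(oxygen_level=0.17, hull_intact=True))
# -> "increase life-support output"
```

Every branch above has a stated purpose, which is exactly what the paragraph means by reasoning having goals and objectives; there is no analogous list of branches we could write down for an emotion.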

In 2015, a study was conducted with Mandarin-speaking students at Bangor University. They were invited to play a game in which they could win money. In each round, they had to accept or decline the bet shown on the screen - for example, a 50% chance of winning 20 points against a 50% chance of losing 100.

The researchers hypothesized that feedback in the participants' native language would carry more emotional weight, so they would behave differently than when communicating in their second language, English. That is exactly what happened: when the feedback was given in their native Chinese, subjects were 10% more likely to bet in the next round, regardless of the risk. This shows that emotion influences reasoning.
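For concreteness, the example bet quoted above is a losing one on average, which is why a purely rule-based gambler would be expected to decline it; betting more often regardless of risk therefore points to an emotional rather than a rational influence. The snippet below only works out that arithmetic - the payoff numbers come from the article's example, the rest is illustrative.

```python
# Expected value of the example bet: 50% chance to win 20 points,
# 50% chance to lose 100 points (numbers from the article's example).
win_prob, win_points, loss_points = 0.5, 20, 100
expected_value = win_prob * win_points - (1 - win_prob) * loss_points
print(expected_value)  # -40.0: on average the bet loses 40 points per round
```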

Returning to AI: since emotions cannot be fully implemented in a program, no matter how complex it may be, a computer's reasoning will never shift under the pressure of its own emotions.

One possible interpretation of HAL's strange "emotional" behavior is that it was programmed to simulate emotion in extreme situations - situations in which it needed to manipulate people not by appealing to their common sense but to their emotional selves, when human reason fails. Only then would such a compelling simulation of emotion appear in these circumstances.

In my opinion, we will never create a machine that can truly feel, hope, fear, or rejoice. Any approximation will be a simulacrum, because a machine will never be human, and emotion is, by definition, the human part.