Dunning-Kruger Effect - Alternative View

In essence, this is a simple statement of the obvious, but it bears repeating. Put plainly, it can be formulated something like this: a foolish person makes mistakes, but cannot recognize those mistakes precisely because of his own foolishness.

This is a simplified interpretation of the cognitive bias that Justin Kruger and David Dunning described in 1999. The full formulation reads: "People with a low level of skill draw erroneous conclusions and make poor decisions, but are unable to recognize their mistakes because of that same low level of skill."

Failure to recognize one's mistakes leads to a conviction that one is right and, consequently, to growing self-confidence and a sense of one's own superiority. Thus the Dunning-Kruger effect is a psychological paradox we all encounter in life: less competent people consider themselves professionals, while more competent people tend to doubt themselves and their abilities.


Dunning and Kruger cited famous remarks by Charles Darwin and Bertrand Russell as the starting point of their research.

And now, in a bit more detail …

We perceive the world around us through our senses. Everything we see, hear, and otherwise feel enters our brain as a stream of data. The brain evaluates that data, and we make decisions based on it; those decisions determine our next steps.

If the heat receptors in our mouth signal that we are drinking boiling water, we spit it out. When we sense that someone is about to harm us, we prepare to defend ourselves. When, while driving, we see the brake lights of the car ahead come on, our foot instantly moves from the gas pedal to the brake.

The rules by which our brain makes decisions are called mental models. Mental models are ideas stored in our brain about how the world around us works.

For each of our mental models, we need to determine how well it corresponds to reality. This correspondence can be called its objectivity. The idea that giving up a serving of ice cream will solve the problem of hunger in Africa obviously has a very low measure of objectivity, whereas the idea that a person who shoots himself in the head will die is very likely true, that is, it has a high measure of objectivity.

However, our brain tends to succumb to the so-called Dunning-Kruger effect. This means there are mental models in our heads that we sincerely believe in, even when they do not correspond to reality. In other words, our subjective ideas sometimes replace objective reality for us. Recent studies have shown that some of our subjective ideas about how the world works inspire the same confidence as an objective fact like 2 + 2 = 4; yet even when absolutely certain, our brain is often mistaken.

A certain MacArthur Wheeler from Pittsburgh robbed two banks in broad daylight without any disguise. CCTV cameras recorded Wheeler's face, which allowed the police to arrest him quickly. The offender was shocked by his arrest. Looking around in disbelief, he said: "I smeared my face with juice."

Wheeler was convinced that by smearing his face (including around his eyes) with lemon juice, he would become invisible to video cameras. He believed this so firmly that, having smeared himself with juice, he went off to rob banks without fear. What is an utterly absurd model to us was irrefutable truth to him. Wheeler invested his biased model with absolute subjective confidence. He was subject to the Dunning-Kruger effect.

The story of the "lemon thief" Wheeler inspired researchers David Dunning and Justin Kruger to study this phenomenon more closely. The researchers were interested in the gap between a person's actual abilities and his perception of those abilities. They hypothesized that a person with insufficient ability suffers from two kinds of difficulty:

  • because of his inability, he makes wrong decisions (for example, smearing himself with lemon juice and going off to rob banks);
  • he is unable to realize that the decision was wrong (even the camera recordings, which he dismissed as falsified, could not convince Wheeler of his inability to be "invisible").

The researchers tested these hypotheses on an experimental group of people who first took a test measuring their abilities in a certain area (logical thinking, grammar, or sense of humor), and then had to estimate their own level of knowledge and skill in that area.

The study found two interesting trends:

  • The least capable people (labeled incompetent in the study) tended to significantly overestimate their abilities. Moreover, the worse their actual abilities, the higher they rated themselves. For example, the less funny a person actually was, the funnier he believed himself to be. Charles Darwin had already captured this clearly: "Ignorance more frequently begets confidence than does knowledge";
  • The most capable (labeled competent) tended to underestimate their abilities. This is explained by the fact that if a task seems simple to a person, he assumes it will be just as simple for everyone else.

In the second part of the experiment, the subjects were given the opportunity to study the other participants' test results, and then to assess themselves a second time.

Comparing themselves to the others, the competent realized that they were better than they had thought. They accordingly adjusted their self-assessment and began to evaluate themselves more objectively.

The incompetent, after this contact with reality, did not change their biased self-assessment. They were unable to admit that the abilities of others were better than their own. As Forrest Gump1 put it, "Stupid is as stupid does."

1 The protagonist of the novel of the same name by Winston Groom and the film by Robert Zemeckis, a man with an intellectual disability. - Translator's note.

The conclusion of the study is this: people who do not know, do not know (do not realize) that they do not know. The incompetent tend to significantly overestimate their own abilities, cannot recognize the abilities of others, and do not change their assessment when confronted with reality. For simplicity, let's say of people suffering from this problem that they have Dunning-Kruger (abbreviated D-K). The study showed that such people come to biased and erroneous conclusions, but their bias does not allow them to understand and admit it.
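The two trends can be sketched numerically. The following toy model is entirely illustrative (the coefficients are made up, not the study's data): if self-assessment only weakly tracks actual standing and is anchored near an optimistic middle, the bottom quartile overestimates itself and the top quartile underestimates itself, reproducing both findings at once.

```python
def self_estimate(actual_percentile: float) -> float:
    """Toy model of self-assessment (illustrative, not from the study):
    estimates cluster around an optimistic anchor and track actual
    standing only weakly (slope well below 1)."""
    return 0.3 * actual_percentile + 55  # assumed slope and anchor

# Mean actual percentile of each quartile of performers.
quartile_means = {"bottom": 12.5, "second": 37.5, "third": 62.5, "top": 87.5}

for name, actual in quartile_means.items():
    est = self_estimate(actual)
    gap = est - actual  # positive = overestimation, negative = underestimation
    print(f"{name:>6}: actual {actual:5.1f} -> self-estimate {est:5.1f} (gap {gap:+5.1f})")
```

With these assumed numbers the bottom quartile's gap is strongly positive while the top quartile's is slightly negative, mirroring the pattern described above: the worst performers overestimate themselves the most, and the best performers modestly underestimate themselves.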

THE STUDY REVEALED TWO MAIN TRENDS:

I. THE COMPETENT TEND TO UNDERESTIMATE THEMSELVES

II. THE INCOMPETENT TEND TO OVERESTIMATE THEMSELVES

The brain protects us with sweet ignorance

The idea that the Dunning-Kruger effect involves a kind of protective reaction of the brain is supported by a condition called anosognosia. An example: a patient who has lost a limb and suffers from anosognosia believes he still has that limb, and it is impossible to convince him otherwise. When a doctor talks to the patient about his healthy left arm, the patient communicates normally; but as soon as the conversation turns to the missing right arm, the patient pretends not to hear. Monitoring of brain activity showed that he does this unconsciously: his damaged brain blocks information pointing to his own deficiency, even at a subconscious level. There have even been cases in which it was impossible to explain to a blind person that he was blind. This extreme form of anosognosia supports the theory that our brain is able to ignore information indicating our incompetence.

It was easier for the "lemon thief's" brain to dismiss the evidence as fabricated than to admit its own incompetence and bias.

At times our brain, as in the case of anosognosia, reacts to information indicating that our mental models are wrong by simply ignoring it, keeping us in a state of bias and sweet ignorance. What risk does this carry? And why should we strive for objectivity?