What Prevents Us From Being Objective: 11 Cognitive Biases - Alternative View

Cognitive biases are systematic errors in human thinking - a kind of logical trap. In certain situations we tend to act according to irrational patterns, even when we believe we are being guided by common sense.

Below are 11 common pitfalls that rob us of objectivity.

Illusion of control

People tend to overestimate their influence on events whose successful outcome matters to them. The phenomenon was discovered in 1975 by the American psychologist Ellen Langer in experiments with lottery tickets. Participants were divided into two groups: people in the first group could choose their lottery tickets themselves, while those in the second were simply handed tickets without any choice. Two days before the drawing, the experimenters offered participants in both groups the chance to exchange their ticket for a ticket in a new lottery with better odds of winning.

The offer was clearly advantageous, yet the participants who had chosen their tickets themselves were in no hurry to part with them - as if their personal choice of ticket could somehow affect the probability of winning.

Zero risk preference


Imagine that you have a choice: reduce the small risk to zero, or significantly reduce the high risk. For example, to bring plane crashes to zero or drastically reduce the number of car accidents. Which would you choose?

Statistically, the second option is the better choice: deaths from plane crashes are far rarer than deaths from car accidents, so that choice would save many more lives. Yet research shows that most people pick the first option: zero risk in any single area looks more reassuring, even though your chances of dying in a plane crash are negligible to begin with.
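The arithmetic behind that choice can be sketched with invented, purely illustrative numbers (neither figure is a real statistic):

```python
# Illustrative (invented) annual death tolls for the two risks.
plane_deaths = 300       # hypothetical: deaths from plane crashes per year
car_deaths = 40_000      # hypothetical: deaths from car accidents per year

# Option A: eliminate the small risk entirely.
saved_a = plane_deaths

# Option B: halve the much larger risk.
saved_b = car_deaths - car_deaths // 2

print(f"Option A (zero plane-crash risk) saves {saved_a} lives")
print(f"Option B (halved car-accident risk) saves {saved_b} lives")
```

However the real numbers are filled in, as long as the larger risk dwarfs the smaller one, the "less reassuring" option B saves far more lives.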

Selective perception

Let's say you don't trust GMOs. If the topic worries you, you probably read news and articles about genetically modified organisms - and as you read, you become more and more convinced that you are right: the danger is real. The catch is that you most likely pay far more attention to news that supports your point of view than to arguments in favor of GMOs, which means you lose objectivity. This tendency to notice information that matches our expectations and to ignore everything else is called selective perception.

Gambler's fallacy

The gambler's fallacy most often lies in wait, unsurprisingly, for gamblers. Many of them try to find a relationship between the probability of the desired outcome of a random event and its previous outcomes. The simplest example is a coin toss: if it comes up heads nine times in a row, most people will bet on tails next time, as if a long run of heads made tails more likely. But that is not so: in fact, the odds remain exactly the same - 50/50 - because each toss is independent of the previous ones.
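The independence claim is easy to check empirically. A minimal simulation (the streak length and trial count are arbitrary choices for the example):

```python
import random

random.seed(0)

def heads_rate_after_streak(streak_len=9, trials=1_000_000):
    """Estimate P(heads) on the flip immediately after `streak_len` heads in a row."""
    heads_after = 0
    streaks_seen = 0
    run = 0  # current run of consecutive heads
    for _ in range(trials):
        flip = random.random() < 0.5   # True = heads
        if run >= streak_len:          # the previous `streak_len` flips were all heads
            streaks_seen += 1
            heads_after += flip
        run = run + 1 if flip else 0
    return heads_after / streaks_seen

p = heads_rate_after_streak()
print(round(p, 2))  # hovers around 0.5: the streak changes nothing
```

No matter how long the streak, the estimated probability stays near 0.5, within sampling noise.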

Survivor bias

This logical trap was discovered during the Second World War, but you can fall into it in peacetime too. During the war, the US military leadership decided to reduce bomber losses and issued an order: determine, from combat results, which parts of the aircraft needed reinforced protection. They studied the returning planes and found many holes in the wings and tail - so it was decided to reinforce those parts. At first glance this looked perfectly logical - but, fortunately, the statistician Abraham Wald came to the military's aid and explained that they had nearly made a fatal mistake. The holes in the returning planes carried information about their strong points, not their weak ones: planes "wounded" elsewhere - in the engine or the fuel tank, say - simply never made it back from the battlefield.

The principle of the wounded survivors is worth keeping in mind today, whenever we are about to draw hasty conclusions from asymmetric information about two groups.
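Wald's reasoning can be illustrated with a toy simulation (the section names and the "fatal" rule are assumptions made for the example): hits are distributed uniformly across the airframe, yet the returning fleet shows damage only where hits were survivable.

```python
import random

random.seed(1)

SECTIONS = ["wings", "tail", "engine", "fuel_tank"]
FATAL = {"engine", "fuel_tank"}   # assumption: a hit here downs the plane

def observed_hits(n_planes=10_000, hits_per_plane=3):
    """Count hits per section, but only on the planes that make it back."""
    counts = {s: 0 for s in SECTIONS}
    for _ in range(n_planes):
        hits = [random.choice(SECTIONS) for _ in range(hits_per_plane)]
        if any(h in FATAL for h in hits):
            continue                  # plane never returns; its hits go unobserved
        for h in hits:
            counts[h] += 1
    return counts

counts = observed_hits()
print(counts)  # engine and fuel-tank counts are zero among the survivors
```

The surviving sample shows holes only in the wings and tail - exactly the sections that least need extra armor.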

The illusion of transparency

Imagine you are in a situation where you have to lie. It feels remarkably difficult: you are sure that everyone sees right through you, and that any involuntary movement will betray your insincerity. Sound familiar? That is the "illusion of transparency" - the tendency to overestimate other people's ability to understand our true motives and experiences.

In 1998, psychologists ran an experiment with students at Cornell University. Students took turns reading questions from cards and answering them, telling the truth or lying depending on the card's instructions. The audience was asked to spot when the speakers were lying, and the speakers were asked to estimate their own chances of fooling the others. Half the liars expected to be caught; in fact, the listeners exposed only a quarter of them. The liars had greatly overestimated their listeners' discernment.

Why does this happen? Most likely because we know too much about ourselves, and therefore assume our knowledge is obvious to an outside observer. The illusion of transparency also works in the opposite direction: we likewise overestimate our own ability to spot other people's lies.

Barnum effect

A common situation: a person stumbles upon a horoscope. He does not, of course, believe in such pseudoscience, but decides to read it purely for entertainment. And then a strange thing happens: the description of his sign matches his own idea of himself remarkably well.

Such things happen even to skeptics: psychologists call this phenomenon the "Barnum effect", after the 19th-century American showman and skillful manipulator Phineas Barnum. Most people tend to accept fairly general and vague descriptions as accurate descriptions of their own personality - and, of course, the more flattering the description, the more "matches" they find. Astrologers and fortune-tellers exploit this effect.

Self-fulfilling prophecy effect

Another cognitive distortion that plays into the hands of fortune-tellers. Its essence is that a prophecy that sounds convincing can cause people to involuntarily take steps toward its fulfillment. In the end, a prophecy that objectively had little chance of coming true suddenly turns out to be right.

The classic version of such a prophecy is described in Alexander Grin's "Scarlet Sails". The storyteller Egle predicts to little Assol that when she grows up, a prince will come for her on a ship with scarlet sails. Assol fervently believes the prediction, and the whole town comes to know of it. Then Captain Gray, who has fallen in love with the girl, learns of the prophecy and decides to make Assol's dream come true. In the end Egle turns out to be right, although the happy ending was delivered by entirely un-fairy-tale mechanisms.

Fundamental attribution error

We tend to explain other people's behavior by their personal qualities, and our own actions by objective circumstances - especially when it comes to mistakes. For example, another person is late because of their lack of punctuality, while our own lateness can always be blamed on a broken alarm clock or a traffic jam. This concerns not only official excuses but also our internal picture of the situation, and such an attitude keeps us from taking responsibility for our actions. So those who want to improve themselves should keep the fundamental attribution error in mind.

Moral licensing effect

A journalist known for his liberal views is caught making homophobic remarks, a priest takes a bribe, and a senator who champions family values is photographed in a strip club. These seemingly extraordinary cases follow a sad pattern known as "moral licensing": if a person builds a solid reputation as a "righteous man", at some point he may fall under the illusion that he really is sinless. And if he is so good, a little weakness will not change anything.

Availability cascade

A cognitive distortion to which all the world's ideologues owe their success: collective belief in an idea becomes far more persuasive when the idea is repeated over and over in public discourse. We often run into it in conversations with our grandmothers: many pensioners are sure that whatever is talked about frequently on television must be true. The newer generation is more likely to feel this effect through Facebook.

Daria Varlamova