Cognitive Distortions Fool Our Heads - Alternative View

Science says we are programmed to deceive ourselves. Is there anything you can do about it?

I am looking at a photograph of myself that is 20 years older than I am now. No, this is not the Twilight Zone; rather, I am trying to rid myself of present bias - the tendency, when weighing two future moments, to give more weight to the one closer to the present. Many studies have shown that this tendency - also known as hyperbolic discounting - is stable and persistent.

Most of the research emphasis is on money. When people are asked whether they would rather receive, say, $150 today or $180 in a month, most choose the first option. Yet when asked whether they would take $150 in a year or $180 in 13 months, most are willing to wait the extra month for the extra $30.
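
That reversal is exactly what a hyperbolic discount curve predicts, and a short sketch makes it concrete. The one-parameter form V = A / (1 + kD) and the daily rate k = 0.01 are my own illustrative assumptions, not figures from the studies described above:

```python
# A minimal sketch of how hyperbolic discounting produces the preference reversal
# described above. The discount form and the rate k are illustrative assumptions.

def discounted_value(amount, delay_days, k=0.01):
    """Subjective value of `amount` received after `delay_days` days."""
    return amount / (1 + k * delay_days)

# Choice 1: $150 today vs. $180 in a month
print(discounted_value(150, 0), discounted_value(180, 30))     # 150.0 vs ~138.5 -> take the $150 now
# Choice 2: $150 in a year vs. $180 in 13 months (the same $30 for one extra month)
print(discounted_value(150, 365), discounted_value(180, 395))  # ~32.3 vs ~36.4 -> wait for the $180
```

With an exponential discount curve the preference would never flip; the flip is the signature of hyperbolic discounting.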

Of course, present bias shows up not only in experiments but in the real world. In the United States, for example, people save far too little for retirement - even when they earn enough not to spend their whole salary on recurring expenses, and even when they work for one of those companies that match contributions to a retirement plan.

This state of affairs prompted a scientist named Hal Hershfield to experiment with photographs. Hershfield is a marketing professor at the University of California, Los Angeles, whose research starts from the idea that we feel little connection between our present and future selves. As a result, as he explained in a 2011 article, "saving money is like choosing between spending it today or giving it to a stranger many years later." The article describes an attempt by Hershfield and several colleagues to change that way of thinking. They asked students to spend a minute looking at digital avatars showing what they would look like at age 70, and then asked what they would do if they suddenly received a thousand dollars. The students who had looked into the eyes of their older selves said they would put an average of $172 into a retirement account - roughly double the amount, about $80, that members of a control group were willing to set aside.

I myself am no longer young - I am in my seventh decade, if you're curious - so Hershfield gave me the chance to see an image not only of myself past 80 (age spots, a wildly asymmetrical face, wrinkles as deep as a Manhattan pothole) but also of my daughter decades from now. He explained that this should make me ask how I would feel at the end of my life if my children were not provided for.

When people hear the word bias, what comes to mind for many, if not most, is either racial prejudice or slanted news coverage. Present bias, by contrast, is an example of cognitive bias - one of a collection of faulty thinking patterns built into the human brain. The collection, I must say, is a large one. The Wikipedia article titled "List of cognitive biases" contains 185 entries: from the actor-observer effect (the tendency to explain our own missteps by the circumstances of the situation, but the troubles that befall others by flaws in their personality) to the Zeigarnik effect (incomplete or interrupted tasks are remembered better than completed ones).

Some of the 185 entries are dubious or even unscientific. The IKEA effect, for example, is defined as "the tendency of consumers to place a disproportionately high value on products they have partly created themselves." But roughly 100 of the biases are real, have been demonstrated many times over, and can introduce serious confusion into our lives.

The gambler's fallacy makes us absolutely certain that if a coin has landed tails five times in a row, it is more likely to come up heads on the sixth toss. In fact, the odds are always 50-50. Optimism bias makes us consistently underestimate the cost and duration of virtually every project we undertake. The availability heuristic makes us think that, say, traveling by plane is more dangerous than traveling by car: images of plane crashes are more vivid and dramatic in memory and imagination, and therefore far more available to our consciousness.
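
The 50-50 claim is easy to check by brute force. Below is a minimal sketch (the simulation setup and sample size are mine, not the article's) that estimates the chance of heads given that the previous five tosses were all tails:

```python
# Estimate P(heads on toss 6 | tosses 1-5 were all tails) for a fair coin.
import random

random.seed(0)
heads_after_streak = streaks = 0
for _ in range(200_000):
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads, False = tails
    if not any(flips[:5]):                 # the first five tosses were all tails
        streaks += 1
        heads_after_streak += flips[5]     # did the sixth toss come up heads?

print(heads_after_streak / streaks)        # approximately 0.5, not "most likely heads"
```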

The anchoring effect is the tendency to rely too heavily on the first piece of information offered when making decisions, estimates, or predictions, especially when it is presented as a number. This is why negotiators deliberately open with figures that are too low or too high: they know those figures will anchor the deals that follow. A classic demonstration of anchoring is an experiment in which participants watched a roulette-style wheel that stopped on either 10 or 65 and were then asked to guess the percentage of UN member countries that are African. Those who saw the wheel stop at 10 guessed 25 percent on average; the others guessed around 45 percent. (At the time of the experiment, the correct answer was about 28 percent.)

The effects of these biases are not confined to individuals. Last year, US President Donald Trump decided to send more troops to Afghanistan and thereby committed the sunk-cost fallacy. He said: "Our country must achieve an honorable and enduring outcome worthy of the enormous sacrifices that have been made, especially in terms of lives." Sunk-cost thinking tells us to stick with a failing investment because of the money we have already lost on it; to finish an unappetizing restaurant meal because we have already paid for it; to prosecute a war that is doomed to failure because of the human and material resources already poured into it. This kind of reasoning is plainly self-defeating.

If I had to pick the most common and most destructive bias on the list, I would probably choose confirmation bias. It drives us to seek out evidence that supports the notions we already hold, to read the facts and ideas we encounter as further confirmation, and to discount or ignore any evidence that supports an alternative view. Confirmation bias shows itself most shamelessly in the current political divide, where each side seems unable to allow that the other could be right about anything.

Confirmation bias arises in many other circumstances as well, sometimes with dire consequences. To quote a 2005 report on the intelligence that preceded the war in Iraq: "When evidence emerged that Iraq did not have [weapons of mass destruction], analysts tended to disregard such information. Rather than weighing the evidence independently, analysts accepted information that fit the prevailing theory and rejected information that contradicted it."

The idea of cognitive biases and faulty heuristics - the shortcuts and rules of thumb by which we make judgments and predictions - was laid out more or less fully in the 1970s by Amos Tversky and Daniel Kahneman, psychologists whose careers began in Israel and continued in the United States. It was they who ran the "share of African countries in the UN" experiment. Tversky died in 1996. Kahneman won the 2002 Nobel Prize in economics for their joint work, which is summarized in his 2011 bestseller Thinking, Fast and Slow. The story of the fraught collaboration between Tversky and Kahneman is told in Michael Lewis's recent book The Undoing Project. Lewis's earlier book Moneyball was about baseball executive Billy Beane's battle with the cognitive biases of old-school scouts, above all the fundamental attribution error: when explaining someone's behavior, we put too much weight on their personal qualities and too little on external factors, many of which can be measured by statistics.

Brain / flickr.com, Kevin Hutchinson

Another key figure in this field is the University of Chicago economist Richard Thaler. One of the biases most closely associated with him is the endowment effect: we place a higher value on a thing simply because we own it. In one experiment conducted by Thaler, Kahneman, and Jack Knetsch, half of the participants were given a mug and asked how much they would sell it for; the average answer was $5.78. The rest were asked how much they would pay for the same mug; the average was $2.21. This contradicts classical economic theory, which says that at a given time, within a given population, a good has a single market value that does not depend on whether any particular person happens to own it. Thaler received the Nobel Prize in Economics in 2017.

Most books and articles about cognitive bias contain, usually near the end, a passage like this one from Thinking, Fast and Slow: "The question that is most often asked about cognitive illusions is whether they can be overcome … The message … is not encouraging."

Kahneman and others draw an analogy with the Müller-Lyer illusion, which arises when we look at two line segments capped by arrows. The segment whose arrows point inward, like arrowheads, looks shorter than the segment whose arrows point outward, like tail fins, even though the two are exactly the same length. The essential point: even after we measure the lines and find them equal, and even after hearing the neurological explanation of the illusion, we still see one line as shorter than the other.

At least with this optical illusion, our slow, analytical mind - what Kahneman calls System 2 - can recognize the Müller-Lyer trick and tell itself not to trust the perception of System 1, which operates automatically and very quickly. But in the real world things are not so simple, because we are dealing not with line segments but with people and situations. "Unfortunately, this sensible procedure is least likely to be applied where it is needed most," Kahneman writes. "Everyone would like a warning bell that rings loudly every time we are about to make an error, but no such bell exists."

Because the biases seem so deeply ingrained and so resistant to change, most attention to countering them has focused not on the problematic thoughts, judgments, or predictions themselves, but on changing behavior through incentives and nudges. For example, while present bias has so far proved intractable, employers have been able to nudge employees into funding retirement plans by making saving the default option; opting out requires real effort. That is, laziness and inertia can be stronger than bias. Procedures can also be designed to discourage or prevent people from acting on their biases. A well-known example is the checklists for doctors and nurses that Atul Gawande presents in his book The Checklist Manifesto.

Is it really true, though, that biases cannot be eliminated or at least significantly weakened? Several studies have offered a preliminary answer. These experiments rely on the responses of randomly recruited subjects, many of them students who care more about the $20 they are paid to participate than about actually changing how they think. But what if the person undergoing the de-biasing were motivated and had volunteered? In other words, what if that person were me?

Naturally, I wrote to Kahneman, who, at 84, still works at Princeton University's Woodrow Wilson School of Public and International Affairs but spends most of his time in Manhattan. He answered quickly and agreed to meet. "I must," he said, "at least try to talk you out of your plan."

I met the professor at a Daily Bread café in lower Manhattan. He turned out to be tall, courteous, and affable, with a pronounced accent and a wry smile. Over apple pie and tea with milk, he told me: "Temperament has a lot to do with my position. You won't find anyone more pessimistic than I am."

Brain neurons / AP Photo, Heather de Rivera / McCarroll Lab / Harvard via AP

In this context his pessimism stems, first, from the impossibility of changing System 1, the part of the mind that thinks fast and produces mistaken judgments analogous to the Müller-Lyer illusion. "I still see the lines as unequal," he said. "The goal is not to trust what I think I see - to understand that I shouldn't believe my eyes." That is achievable with an optical illusion, he noted, but extremely difficult with real cognitive biases.

According to Kahneman, the most effective check on biases comes from the outside: others can spot our errors more readily than we can ourselves. And "slow-thinking organizations," as he put it, can adopt strategies that include monitoring individual decisions and predictions. These may include checklists and "premortems," an idea and term coined by the cognitive psychologist Gary Klein. A premortem tries to counter optimism bias by asking team members to imagine that a project has failed and to write a few sentences explaining why. The exercise has been shown to help people think ahead.

"My position is that none of this has any effect on System 1," Kahneman said. "You can't improve intuition. Perhaps, with long-term training, lots of conversation, and exposure to behavioral economics, you can cue your reasoning and get System 2 to follow rules. Unfortunately, for most people, in the heat of argument the rules vanish like smoke."

While I was in touch with Kahneman, he was also corresponding with Richard Nisbett, a professor of social psychology at the University of Michigan. The two have had a professional relationship for decades. Nisbett was instrumental in spreading Kahneman and Tversky's ideas through his 1980 book Human Inference: Strategies and Shortcomings of Social Judgment. Kahneman, for his part, describes in Thinking, Fast and Slow an early article by Nisbett showing that subjects are reluctant to believe statistical and other aggregate data, and tend instead to base their judgments on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)

Over time, however, Nisbett's research and thinking have come to emphasize the possibility of teaching people to overcome or sidestep a number of pitfalls, including base-rate neglect, the fundamental attribution error, and the sunk-cost fallacy. He had emailed Kahneman partly because he was working on a memoir and wanted to revisit a long-ago conference conversation with Kahneman and Tversky. Nisbett had come away with the impression that his colleagues were angry, taking everything he said and did as veiled criticism. Kahneman recalled the conversation and replied: "Yes, I remember we were (somewhat) annoyed by your work on the ease of training statistical intuitions (angry is far too strong)."

When Nisbett is asked for an example of his approach, he usually cites a baseball study. University of Michigan students were called under the pretext of a sports survey and asked why there are always several Major League batters hitting around .450 early in the season, yet no one ever finishes a season with an average that high. Of the students who had not taken an introductory statistics course, about half gave wrong answers such as "pitchers adjust to the batters" or "batters get tired as the season wears on." The other half gave the correct answer, citing the law of large numbers, which says that outliers are far more common when the sample is small. As the season goes on and the number of at-bats grows, regression toward the mean is inevitable. When Nisbett puts the same question to students who have taken the statistics course, about 70 percent answer correctly. Contrary to Kahneman, he believes this shows that the law of large numbers can be installed in System 2 - and perhaps, given even minimal cues, in System 1.
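
A small simulation makes the point about sample size. The numbers below are my own toy assumptions (every hitter has an identical true ability of .270); the only claim carried over from the text is that extreme averages show up early in the season and vanish as at-bats accumulate:

```python
# How often does a .270 hitter *look* like a .450 hitter, early vs. late in the season?
import random

random.seed(1)

def share_hitting_450(n_players, at_bats, true_average=0.270):
    """Fraction of identical .270 hitters whose observed average is .450 or better."""
    hot = 0
    for _ in range(n_players):
        hits = sum(random.random() < true_average for _ in range(at_bats))
        if hits / at_bats >= 0.450:
            hot += 1
    return hot / n_players

print("after  20 at-bats:", share_hitting_450(10_000, 20))   # a few percent look like .450 hitters
print("after 500 at-bats:", share_hitting_450(10_000, 500))  # essentially nobody does
```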

Nisbett's other favorite example is that economists who have absorbed the lesson of sunk costs routinely walk out of bad movies and leave bad restaurant meals unfinished.

I spoke with Nisbett by phone and asked him about the disagreement with Kahneman. His voice sounded a little shaky to me. "Danny seemed convinced that what I was demonstrating was trivial," he said. "To him it was clear that training is hopeless for any kind of judgment. But we've been testing University of Michigan students for four years, and they do very well on these problems."

In his 2015 book Mindware: Tools for Smart Thinking, Nisbett writes: "I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people's reasoning for an indefinitely large number of events."

In one of his emails to Nisbett, Kahneman suggested that much of the difference in their positions comes down to temperament: pessimist versus optimist. In reply, Nisbett pointed to another factor: "You and Amos specialized in hard problems, on which even clever people arrive at wrong answers. I began by studying easy problems, which you guys would never get wrong but ordinary people certainly do … Then you can look at the effect of training on easy problems, and it turns out to be huge."

An example of an easy problem is the batters at the start of the baseball season. An example of a hard one is the so-called Linda problem, which formed the basis of one of Kahneman and Tversky's early articles. In the experiment, participants were told about a fictional woman named Linda: her concern for social justice, her philosophy major in college, her participation in anti-nuclear demonstrations, and so on. Subjects were then asked which was more probable: (a) that Linda is a bank teller, or (b) that Linda is a bank teller and is active in the feminist movement. The correct answer is (a), since a single condition is always at least as probable as that condition plus another. But because of the conjunction fallacy (the assumption that a combination of specific conditions is more probable than a single general one) and the representativeness heuristic (our strong urge to apply stereotypes), more than 80 percent of the students surveyed chose answer (b).
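
The rule the majority violates is simply P(A and B) ≤ P(A). A brute-force sketch over a made-up population (all the frequencies below are arbitrary assumptions) shows that no distribution of traits can make option (b) more likely than option (a):

```python
# The conjunction rule behind the Linda problem, checked on a synthetic population.
import random

random.seed(2)
population = [
    {"teller": random.random() < 0.05, "feminist": random.random() < 0.30}
    for _ in range(100_000)
]

p_teller = sum(p["teller"] for p in population) / len(population)
p_teller_and_feminist = sum(p["teller"] and p["feminist"] for p in population) / len(population)

print(p_teller, p_teller_and_feminist)
assert p_teller_and_feminist <= p_teller   # option (a) can never be less likely than option (b)
```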

Nisbett reasonably asks how often in real life we face a judgment like the one the Linda problem requires. I can't think of a single occasion in my own life. It is a logic trick.

Nisbett suggested that I take his online course, Mindware: Critical Thinking for the Information Age, in which he lays out what he considers the concepts and skills most effective at reducing biased judgment. He then asked me to answer the questions he gives his University of Michigan students. So I did.

The course consists of eight lessons with graphics and quizzes, and Nisbett himself appears on screen as an authoritative yet approachable psychology professor. I would recommend it to anyone. He explains the availability heuristic this way: "People are surprised that suicides outnumber homicides, and that drowning deaths outnumber deaths by fire. People always think crime is increasing," even when it isn't.

Passers-by on the street in Madrid / AP Photo, Andres Kudacki

He addresses the logical error that underlies confirmation bias: when testing a hypothesis we believe to be true, we tend to look for examples that confirm it. But Nisbett shows that no matter how many confirming examples we collect, we can never prove the claim; the better strategy is to look for evidence that would refute it.

And he approaches base-rate neglect through his own strategy for choosing films. His decision never rests on advertising, a particular review, or an impressive title. "I go by the base rates and pick the books and films that people I trust recommend," he says. "Most people think they're not like everyone else. But they are."

After I finished the course, Nisbett sent me the questionnaire that he and his colleagues use at the University of Michigan. It contains several dozen problems meant to measure subjects' resistance to cognitive biases. One of them is a version of the classic card-selection task: each card has a letter on one side and a number on the other, and you must decide which cards to turn over to test the rule that any card with an "A" on one side has a "4" on the other.

Because of confirmation bias, many untrained people answer (e). But the correct answer is (c). The only way to test the rule is to try to refute it, and the only cards that can do that are the one showing the letter "A" (the rule is refuted if anything other than a 4 appears on the other side) and the one showing the number 7 (the rule is refuted if the other side shows an "A").
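
A minimal sketch of that logic, written against an assumed set of visible faces ("A", "B", "4", "7"), since the questionnaire itself is not reproduced here:

```python
def could_falsify(visible_face: str) -> bool:
    """Could the hidden side of this card possibly break the rule
    'if a card has an A on one side, it has a 4 on the other'?"""
    if visible_face.isalpha():
        # A letter is showing, so a number is hidden.
        # Only an A can be contradicted by its back (a number other than 4).
        return visible_face == "A"
    # A number is showing, so a letter is hidden.
    # A visible 4 cannot break the rule (the rule does not say only A-cards carry a 4),
    # but any other visible number breaks it if an A turns out to be on the back.
    return visible_face != "4"

cards = ["A", "B", "4", "7"]
print([card for card in cards if could_falsify(card)])   # ['A', '7'], the only cards worth turning over
```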

I got that item right, and when Nisbett received my answers he wrote: "Very few University of Michigan undergraduates did as well as you, if any did. I'm sure at least a few psychology students in their second year or beyond matched your result. Your score is close to perfect."

Still, I didn't feel that reading Mindware and taking the online course had rid me of my biases. For one thing, I hadn't been tested beforehand, so I may have been relatively unbiased to begin with. For another, many of the test questions, including the one above, seemed far removed from the situations one meets in everyday life. And I was, in Kahneman's words, "savvy": unlike the Michigan undergraduates, I knew exactly why I was being asked these questions and approached them accordingly.

Nisbett, for his part, insisted that the results were telling. "Once you get results under testing conditions," he told me, "you will get them in the real world."

Nisbett's course and Hal Hershfield's chance to come face to face with an older version of yourself are not the only methods on offer for reducing biased judgment. The New York-based NeuroLeadership Institute offers organizations and individuals a variety of trainings, webinars, and conferences that promise, among other things, to use cognitive psychology to teach participants to counter bias. This year's two-day summit takes place in New York; for $2,845 you could, for example, find out "why our brains are so bad at thinking about the future, and how we can do better."

Philip Tetlock, a professor at the University of Pennsylvania's Wharton School, and his wife and research partner Barbara Mellers have spent years studying so-called superforecasters: people who manage to sidestep cognitive biases and predict future events far more accurately than the pundits and so-called experts who appear on television. In his book Superforecasting: The Art and Science of Prediction (written with Dan Gardner), and through Good Judgment, the company he and Mellers founded, Tetlock shares the superforecasters' methods.

One of the most important of these is what Tetlock calls "the outside view." The inside view, by contrast, is the product of the fundamental attribution error, base-rate neglect, and other biases that constantly tempt us to base judgments and predictions on good stories rather than on data and statistics. Tetlock explains: "At someone's wedding, people come up and ask how long you think the marriage will last. If you are taken aback because you know the newlyweds really love each other, you've been pulled into the inside view." About 40 percent of marriages end in divorce, and that statistic says far more about the fate of any particular marriage than adoring glances do. Not that this insight should be shared at the reception.

Among recent methods for reducing biased judgment, researchers consider a set of video games the most promising. They grew out of the war in Iraq and the disastrous weapons-of-mass-destruction miscalculation that shook the intelligence community. In 2006, hoping to prevent another error of that magnitude, the US government created the Intelligence Advanced Research Projects Activity (IARPA) to apply new technology to the collection and analysis of intelligence. In 2011, IARPA launched a program called Sirius to fund the development of "serious" video games that could mitigate what were judged the six most damaging cognitive biases: confirmation bias, the fundamental attribution error, the bias blind spot (the tendency not to compensate for one's own biases), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that other people think the way you do).

Six teams set out to develop such games, but only two made it to the finish line. The one led by Carey Morewedge, now a professor at Boston University, has drawn the most attention. Working with staff at the companies Creative Technologies and Leidos, Morewedge developed a game called Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and again 8 to 12 weeks later.

After taking the test, I played the game, which features plenty of men and women in tight clothing who are not especially good at navigating the world around them. You play the neighbor of a woman named Terry Hughes, who in the first part of the game vanishes without a trace. In the second, she reappears and needs your help untangling the machinations under way at her company. Along the way you are asked to make judgments and predictions, some related to the story and some not, all designed to expose your repertoire of biases. You receive feedback on your answers immediately.

The human brain on display in Sao Paulo / AFP 2017, Mauricio Lima

For example, as you search Terry's apartment, the building superintendent knocks on the door and, apropos of nothing, starts talking about Mary, another tenant, whom he describes as not at all athletic. He mentions that 70 percent of the tenants go to Rocky's Gym, 10 percent go to Entropy Fitness, and the remaining 20 percent stay home and watch Netflix. Which gym, he asks, is Mary most likely to belong to? The wrong answer, driven by base-rate neglect (a cousin of the representativeness heuristic), is "Neither - Mary is a couch potato." The correct answer, based on the numbers the superintendent has just supplied, is Rocky's Gym. When study participants were tested immediately after playing the game or watching the video, and again a couple of months later, everyone's scores improved, and the game players improved more.
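
A hedged sketch of why the base rate should win here. The 70/10/20 split comes from the game's superintendent; the chance of each kind of tenant being described as "not athletic" is an invented likelihood, purely for illustration:

```python
# Base rates vs. a weak stereotype cue, via Bayes' rule (likelihoods are made up).
priors = {"Rocky's Gym": 0.70, "Entropy Fitness": 0.10, "stays home": 0.20}
p_described_not_athletic = {"Rocky's Gym": 0.30, "Entropy Fitness": 0.40, "stays home": 0.70}

# Posterior is proportional to prior times the likelihood of the "not athletic" description.
unnormalized = {k: priors[k] * p_described_not_athletic[k] for k in priors}
total = sum(unnormalized.values())
posterior = {k: v / total for k, v in unnormalized.items()}

for option, prob in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{option:16s} {prob:.2f}")
# Rocky's Gym ~0.54, stays home ~0.36, Entropy Fitness ~0.10: unless the remark is far
# more diagnostic than assumed here, the 70 percent base rate keeps Rocky's Gym on top.
```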

When I spoke with Morewedge, he said he saw the results as vindicating the research and ideas of Richard Nisbett. "Nisbett's work was largely written off by the field, on the assumption that training can't reduce bias," he told me. "The literature on training methods says that books and classes are fine entertainment but essentially ineffective. But the game has much larger effects. It surprised everybody."

Shortly after finishing the game, I took the test again, with mixed results. I showed marked improvement on confirmation bias, the fundamental attribution error, and the representativeness heuristic, and little improvement on the bias blind spot and the anchoring effect. My lowest initial score (44.8 percent) was on projection bias, and after the game it actually dropped a bit. (I really do need to stop assuming that everyone thinks the way I do.) But even the positive results reminded me of Daniel Kahneman's words: "Testing doesn't convince me. You could give the test a couple of years later, but it still gives the test-taker cues and reminds him what it's all about."

I took both Nisbett's and Morewedge's tests on a computer screen rather than on paper, but the point stands: it is one thing for the effect of training to show up as better test scores, and quite another for it to show up as behavior in the real world. Morewedge told me that some preliminary work on real-world problem solving after playing Missing has produced "promising results," but that it is too early to talk about them.

I am not as pessimistic as Daniel Kahneman, nor as optimistic as Richard Nisbett, but I have noticed a few changes in my behavior. Recently, for example, it was hot out, and I put two dollars into a vending machine for a bottle of water. The bottle never dropped: the mechanism holding it in place was broken. There was another row of water bottles nearby, and that row's mechanism clearly worked. My instinct was not to buy a bottle from the working row, because the water would then have cost me twice as much. But everything I had learned about cognitive biases said that this was sunk-cost thinking: the first two dollars were gone either way, and two dollars for the water was a price I had already shown I was willing to pay. So I put in more money, got the water, and happily drank it.

Going forward, I will do my best to monitor my thoughts and reactions. Say I am looking for a research assistant. Candidate A has impeccable references and experience but turns out to be tongue-tied and unable to make eye contact; Candidate B loves to talk about basketball - my favorite topic! - but his references are mediocre at best. Will I be able to overcome the fundamental attribution error and hire Candidate A?

Or say there is a certain official whom I despise for his character, behavior, and ideology, but under whose leadership the economy is performing well. Will I be able to set aside my powerful confirmation bias and at least admit the possibility that he deserves some of the credit?

As for the issue Hal Hershfield raised at the outset - estate planning - I have always been like the ant, storing up for winter while the grasshoppers sing and frolic. But I am as good at putting things off as I am at saving. A few months ago my financial adviser offered a free review of my will, which was drawn up a couple of decades ago and definitely needs revisiting. Something about making a will stirs up a storm of biases, from the ambiguity effect (in which "decision making suffers from a lack of information or ambiguity," in Wikipedia's definition) to normalcy bias (the tendency not to plan for a disaster) to the crowning touch, the ostrich effect (does that one really need explaining?). My adviser sent me a prepaid envelope, which is still sitting on the floor of my office, gathering dust. And as my hindsight bias tells me, I knew all along that that is exactly what would happen.

Ben Yagoda