
Mistakes Were Made, But Not by Me: A Look at Some Common Cognitive Errors


When I started researching the different types of cognitive errors to which we fallible humans are susceptible, I had a hypothesis: Blues don’t make cognitive errors. Reds do.

Oh, the irony. My presumption that my team of liberal and progressive Blues is free of cognitive bias was itself a trifecta of conformity, availability, and confirmation bias.

Cognitive errors arise innocently and are unrelated to intelligence. In many cases, cognitive errors are our intuition gone awry. While intuition often serves us well in daily life, in complex situations that call for slow, careful deliberation and reasoning, relying on intuition alone can lead to the wrong conclusion.[i]

We all make cognitive errors, although depending on our media choices and underlying belief structures, we may be more prone to some types of errors than others. Cognitive errors come in dozens of flavors. In this article, I’ll outline a few that I see as playing a particularly strong role in the formation and durability of political opinions.

Confirmation bias (the mother of all cognitive errors). Humans look for evidence that confirms their beliefs, ignoring or discounting evidence that does not. Countless experiments have shown this to be true. Recently, neuroscientists have discovered that consuming or sharing information that confirms our beliefs delivers a highly pleasurable hit of dopamine. Facebook was likely aware of this when it rolled out its ‘Like’ and ‘Share’ features.

There’s a famous experiment involving two sets of Stanford students, one pro-death penalty and one against. The students read two fabricated studies: one that demonstrated that the death penalty has a deterrent effect and one that said it does not. After reading the same two studies, the students who were already pro-death penalty became more adamantly in favor of it, and the students who were already against it became even more strongly opposed. Being exposed to beliefs that challenge our worldview is uncomfortable to most people and intolerable to many, and we respond by denying, ignoring, and/or forgetting the new information.

Correlation versus causation. If (a) my city bans bump stocks and (b) gun violence drops by ten percent over the next year, it would be very tempting to conclude that (a) caused (b). And maybe it did—but without further investigation, we don’t know whether the relationship between the two is causal or a mere correlation: violence may have been trending down anyway, or some third factor may have driven both the ban and the drop.
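For readers who like to tinker, here’s a minimal Python sketch of how that can happen. Everything in it is made up for illustration: a hidden “civic investment” factor nudges a city both toward banning bump stocks and toward less violence, while the ban itself does nothing.

```python
import random

random.seed(42)

def simulate_city():
    # Hidden confounder: a city's overall "civic investment," from 0 to 1.
    investment = random.random()
    # High-investment cities are more likely to pass a ban...
    banned = random.random() < investment
    # ...and also see bigger drops in violence. The ban itself has no effect.
    percent_change = -20 * investment + random.gauss(0, 5)
    return banned, percent_change

cities = [simulate_city() for _ in range(10_000)]
with_ban = [change for banned, change in cities if banned]
without_ban = [change for banned, change in cities if not banned]

print(f"Average change with a ban:    {sum(with_ban) / len(with_ban):+.1f}%")
print(f"Average change without a ban: {sum(without_ban) / len(without_ban):+.1f}%")
```

Run it and the banning cities show a noticeably larger average drop in violence, even though the ban has zero causal effect in this toy model; the correlation comes entirely from the hidden factor driving both.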

Future discount/present bias. Although it doesn’t always seem like it when we sit and try to meditate, we are preoccupied with the present. We want good health, security, and creature comforts, and we want them now. What’s happening right now is viscerally real compared to imagining a remote future. Present bias leads people to under-save for retirement and to neglect good nutrition and exercise. It also shows up in failure to reckon with the long-term impacts of burning fossil fuels.

Headwind–tailwind asymmetry. The human brain is Velcro for negative experiences. It makes sense from an evolutionary perspective—you definitely want to remember which plant made you puke. But in modern life, people tend to be keenly aware of all the hard knocks life has dealt them while forgetting how good luck or help from others has benefited them. If people don’t recognize the ways in which they benefit from government services, they might resent paying taxes. Headwind–tailwind asymmetry can also blind wealthy people to the advantages of an upper-class upbringing.

Conformity bias. Were you ever 14 years old? Good, then I don’t have to explain this one, but here’s a jaw-dropping experiment that shows how strongly conformity bias affects even adults: The subjects examined drawings of 3D objects and identified whether the objects were the same or different. When actors in the room gave intentionally wrong answers, the subjects gave the wrong answer 41% of the time—even though the correct answer was obvious. When people overrode the group and gave the correct answer, brain scans showed their amygdalas lighting up. Nonconformity is scary, even in a no-stakes situation with people you’ll likely never see again.

One fascinating manifestation of conformity bias is when someone consciously says prejudiced things they don’t truly believe for the sake of fitting in. Conversely, sometimes people refrain from expressing their prejudices for the sake of fitting in. Many researchers believe that people’s attitudes toward “other” groups have less to do with their experiences with members of those groups than with the prevailing attitudes among members of their own social group. When the group norm shifts, the individual’s behavior and beliefs shift accordingly. In one experiment, researchers were able to increase or decrease people’s stereotypes of African Americans by simply telling them (falsely) that their level of stereotypic thinking was lesser or greater than their peers![ii]

Conformity bias also plays a role in climate denialism. I spoke with Jerry Taylor, founder of the Niskanen Center, a libertarian think tank that endorses climate taxation. Taylor, a former professional climate skeptic for the Cato Institute, sees climate as an “identitarian” issue. He says that many Republicans understand the reality of anthropogenic climate change but fear the repercussions of admitting it.

Progressive activists appear to be more susceptible to conformity bias than other Americans, including very conservative Americans: forty-two percent report feeling pressured by other progressives to think and talk a certain way.

Attribution error is the tendency to overemphasize personal characteristics and to ignore situational factors when judging someone’s behavior or success in the world. The thinking is: When I do something wrong, I can’t help it, but if someone else does the same thing, it’s because they’re a bad person. If I’m broke, it’s due to circumstances beyond my control, but when someone else is poor, it’s because they’ve made bad choices.

Attribution error can lead to stereotyping—if a white person is on welfare, it’s because they’re down on their luck, whereas a black welfare recipient must be lazy.

Cross-cultural dialogue expert David Campt notes that attribution error can also show up when a person of color, who has suffered innumerable incidents of actual discrimination, mistakenly believes they are being treated poorly because of their race.[iii]

Availability bias. A father and son wreck their car and are rushed to the hospital. The emergency room surgeon looks at the boy and cries, “Oh my God, that’s my son!” How can this be? If, like me, it took you more than a half-second to solve this riddle, you’re afflicted with availability bias; most, if not all, of the surgeons you’ve been exposed to have been men, and you were slow to realize that the surgeon in the riddle was the boy’s mother. I’m a feminist who has had two female surgeons, yet availability bias still got the better of me.

Availability bias leads people to fear air travel more than driving because, even though flying is far safer, grisly images of plane crashes are stuck in our heads. It makes people think crime is on the rise, even when it’s not, because there’s always a recent local crime story they can recall. It can also lead people to believe—or reinforce their belief—that Muslims commit more acts of terrorism or black men commit more violent crimes. As long as the image of a plane crashing into the World Trade Center, or the mug shot of a black man flashed across a TV screen, is what springs most readily to mind, availability bias can create or perpetuate stereotypes.

Intuitionism. University of Chicago political scientist Eric Oliver divides people into intuitionists and rationalists. Quick test: If you had to choose between putting a nickel from the ground in your mouth or wearing laundered pajamas that once belonged to Charles Manson, which would you choose? If you chose the latter, you’re probably a rationalist.

Intuitionists go with their gut—if something seems dangerous or disgusting, they’ll avoid it, no matter what the data shows. People who score high on intuitionism are the most likely to buy into conspiracy theories involving hidden powers that control events.

Gullibility. Developmental psychologist Stephen Greenspan is a leading expert in the study of gullibility, which he defines as a pattern of being repeatedly duped in the face of warning signs.[iv] Several personality traits can predispose someone to gullibility, including agreeableness, hypnotic suggestibility, high trust, and, paradoxically, paranoia, at least when it comes to falsehoods that align with a distorted self-conception as a persecuted victim.

Gullibility is not a function of intelligence or educational level. Rather, Greenspan says, gullibility is a function of cognitive laziness, in which rationality lapses and the emotional desire to believe what one wants to be true takes precedence.[v] As the saying often attributed to Mark Twain goes, “It’s easier to fool people than to convince them that they have been fooled.”

Certain circumstances predispose people to ideological cons: We’re more likely to believe falsehoods if they concern risks such as terrorism or crime, and more likely to believe authority figures.

Liberals sometimes see themselves as immune to gullibility, but Greenspan notes that liberals’ overestimation of their intellectual abilities can make them overconfident of their ability to spot deception; moreover, liberals’ idealism can make them naïvely trusting of the intentions of nefarious actors.[vi]

Knowing a bit about cognitive errors can help us keep political conversations from going off the rails, but proceed with caution—no one wants to be told they’re in the grip of cognitive bias. Better to ask questions that help the person see for themselves where their thinking may have gone astray. Better yet, catch your own cognitive errors and fess up to them—I have a gut feeling and hard evidence that to err is human.


Erica Etelson is a member of Braver Angels and the author of Beyond Contempt: How Liberals Can Communicate Across the Great Divide.


[i] Kahneman, Daniel. Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux, 2015, 12-13.

[ii] Crandall, Chris, Amy Eshleman, and Laurie O’Brien. “Social Norms and the Expression and Suppression of Prejudice: The Struggle for Internalization.” Journal of Personality and Social Psychology 82 (2002): 359-78.

[iii] Campt, David W. The White Ally Toolkit Workbook: Using Active Listening, Empathy, and Personal Storytelling to Promote Racial Equity. Newton Center, MA: I AM Publications, 2018, 194.

[iv] Greenspan, Stephen. Annals of Gullibility: Why We Get Duped and How to Avoid It. Westport, CT: Praeger, 2009, 2.

[v] Greenspan, 150-55.

[vi] Greenspan, 63-66.
