On January 6, 1995, a heavyset five-foot-six, 270-pound middle-aged man robbed two Pittsburgh banks in broad daylight. He wore no mask or disguise of any sort, and he smiled at surveillance cameras before walking out of each bank.

Later that night, police arrested a surprised McArthur Wheeler. When they showed him the surveillance tapes, Mr. Wheeler stared in disbelief. “But I wore the juice,” he mumbled.

“But I wore the juice.”

McArthur Wheeler

Apparently, Mr. Wheeler thought that rubbing lemon juice on his skin would render him invisible to videotape cameras. After all, lemon juice is used as invisible ink, so as long as he didn’t come near a heat source, he reasoned, he should have been completely invisible.

Police concluded that Mr. Wheeler was not crazy or on drugs — just incredibly mistaken.

The unfortunate affairs of Mr. Wheeler inspired a series of psychological studies by Professor David Dunning of Cornell University and his graduate student Justin Kruger. They reasoned that almost everyone holds favorable views of their abilities in various social and intellectual domains. However, some people are unaware of their lack of ability and mistakenly assess their abilities as much higher than they actually are. This ‘illusion of confidence’ is also called the ‘Dunning-Kruger effect’ and describes the cognitive bias of inflated self-assessment.

To investigate this phenomenon in the lab, Dunning and Kruger designed some clever experiments. In one study, they asked undergraduate students a series of questions about grammar, logic, and jokes, and then asked each student to estimate their overall score as well as their rank relative to the other students. Interestingly, the students who scored the lowest consistently overestimated how well they had done, and by a wide margin: students who scored near the bottom estimated that they had performed better than two-thirds of the other students!

But this ‘illusion of confidence’ extends beyond the classroom. In another study, Dunning and Kruger left the lab and went to a gun range. There, they quizzed gun hobbyists about gun safety, and again those who answered the fewest questions correctly wildly overestimated their knowledge of firearms. Today, if you watch any talent show on television, you will see the shock on the faces of contestants who do not make it past auditions and are rejected by the judges. It may seem almost comical to us, but they are genuinely unaware of how much they have been misled by their own illusory superiority.

Whether we are designing a rigorous scientific study or raising children, we depend on knowledge, wisdom, and understanding to be successful and satisfied in the different parts of our lives. Sometimes we try things that lead to favorable outcomes, but other times, like McArthur Wheeler’s lemon juice hypothesis, our approaches are imperfect, irrational, inept, or just plain stupid.

“Ignorance more frequently begets confidence than does knowledge.”

Charles Darwin

The problem is that when people are incompetent, not only do they reach wrong conclusions and make unfortunate choices, but they are also robbed of the ability to realize their mistakes. Instead of being confused, perplexed, or thoughtful about their erroneous ways, incompetent people insist that their ways are correct. As Charles Darwin said, “Ignorance more frequently begets confidence than does knowledge.”

A simple example of this is driving ability. One study found that 80% of drivers rate themselves as above-average drivers. Similar trends have been found when people rate their relative popularity and cognitive abilities, but illusions of superiority are not always so mundane and can have real consequences.

Consider the anti-vaccination movement. A group of people with no medical or scientific training are refusing to vaccinate their children for fear that the vaccines will cause autism. Even though there is no scientific evidence of a link between vaccines and autism, their erroneous opinions are so loud and convincing that they have caused the reemergence of diseases that had previously been eradicated in the United States. Globally, the anti-vaccination movement has caused the resurgence of many preventable diseases, which you can visualize on this interactive map made by the CDC.

We live in a golden age of rational — or irrational — ignorance. A few weeks ago, Kellyanne Conway used the phrase ‘alternative facts’ to defend the false statements made by the White House Press Secretary about the number of attendees at President Donald Trump’s inauguration ceremony. Immediately, social media erupted against the appropriation and acceptance of falsehoods.

The problem is that conventional educational methods of simply stating the facts are not effective persuasive methods.

The problem is that conventional educational methods of simply stating the facts are not effective persuasive methods. In fact, this approach can backfire. In 2013, researchers tracked the effect of high school biology classes in Oklahoma on students’ understanding of the theory of evolution. After the class, students were more confident in their knowledge of evolution and made more accurate statements about it. However, misconceptions about evolution also increased. For example, when asked how strongly they agreed with the statement “Variation among individuals is important for evolution to occur,” the percentage of students who strongly agreed increased from 11% to 22%, but the percentage who strongly disagreed also rose, from 9% to 12%. In fact, the only response that uniformly decreased after the biology class was “I don’t know.” This type of education, it seems, can instill errors that we then retain.

In the classroom, some of the best methods for disarming misconceptions use the Socratic method, in which teachers present statements that the class discusses together to arrive at a logical conclusion. Unfortunately, we live in a world of rampant misinformation in environments that cannot be so well controlled; the Internet and news media make it almost impossible to distinguish truth from fallacy. Nevertheless, it is still possible to make facts louder than lies. To eradicate ‘alternative facts,’ simply stating actual facts is not enough. It is necessary to state the misconception and then explain the truth. YouTubers like Veritasium and Vsauce use this technique to dispel myths and explain truths in ways that actually stick.

What can you do to convince people of the errors in their beliefs? Michael Shermer offered this advice in his recent article in Scientific American:

1. Keep emotions out of the exchange.
2. Discuss, don’t attack (no ad hominem and no ad Hitlerum).
3. Listen carefully and try to articulate the other position accurately.
4. Show respect.
5. Acknowledge that you understand why someone might hold that opinion.
6. Try to show how changing facts does not necessarily mean changing worldviews.

It is time for us as a society to wipe the lemon juice off our faces. It doesn’t wash off easily, but it can be removed. And the best soap for the job is hard facts.

~

Written by Kate Fehlhaber.

Images made by Kate Fehlhaber and Kayleen Schreiber.

~

References:

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121.

McCormick, I. A., Walkey, F. H., & Green, D. E. (1986). Comparative perceptions of driver ability—a confirmation and expansion. Accident Analysis & Prevention, 18(3), 205-208.

Roese, N. J., & Olson, J. M. (2007). Better, stronger, faster: Self-serving judgment, affect regulation, and the optimal vigilance hypothesis. Perspectives on Psychological Science, 2(2), 124-141.

Shermer, M. (2017). How to Convince Someone When Facts Fail. Scientific American.

Yates, T. B., & Marek, E. A. (2013). Is Oklahoma really OK? A regional study of the prevalence of biological evolution-related misconceptions held by introductory biology teachers. Evolution: Education and Outreach, 6(1), 6.

Zuckerman, E. W., & Jost, J. T. (2001). What makes you think you’re so popular? Self-evaluation maintenance and the subjective side of the “friendship paradox”. Social Psychology Quarterly, 207-223.


Kate Fehlhaber

Kate graduated from Scripps College in 2009 with a Bachelor of Arts degree in Neuroscience, completing the cellular and molecular track with honors. As an undergraduate, she studied long-term plasticity in models of Parkinson’s disease in a neurobiology lab at the University of California, Los Angeles. She continued this research as lab manager before entering the University of Southern California in 2011 and then transferring to UCLA in 2013. She completed her PhD in 2017, where she studied the first synapse of sight. Listen to her talk about her vision research, science communication, photography, and other hobbies in this recent episode of the Forbes podcast "The Limit Does Not Exist."

3 Comments

  1. Hi, I liked your article, your arguments, and your ideas, yet I felt left hanging because I wanted to know about the neural mechanisms involved. I’ve put a lot of thought into why our human brains love and cling to beliefs. I think it has to do with our love affair with story, i.e. films, books, poems, songs, etc. It seems to me our brains are wired to be “what if” machines that serve us well most of the time but get us into trouble a lot of the time, i.e. divorces, crime, addictions, etc.

    1. Hi Ken! Thanks for your thoughtful comment! This 2012 paper by Fleming and Dolan (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3318765/) might offer the best insights into the neural basis of metacognitive accuracy, or the overconfidence that occurs when people judge that they will remember more information on a future test than they actually do. Basically, the rostral and dorsolateral prefrontal cortex (PFC) is important for the accuracy of retrospective judgements of performance. Also, prospective judgements of performance may depend upon medial PFC. Hope that helps!

    2. Another paper worth reading is the 2016 fMRI data analysis (https://academic.oup.com/scan/article/11/12/1942/2544442/Neural-correlates-of-metacognitive-ability-and-of) that shows that: “The results showed that higher metacognitive accuracy was associated with a decrease in activation in the anterior medial prefrontal cortex, an area previously linked to metacognition on perception and memory. Moreover, the feeling of confidence about one’s choices was associated with an increase of activation in reward, memory and motor related areas including bilateral striatum and hippocampus, while less confidence was associated with activation in areas linked with negative affect and uncertainty, including dorsomedial prefrontal and bilateral orbitofrontal cortex. This might indicate that positive affect is related to higher confidence thereby biasing metacognitive decisions towards overconfidence. In support, behavioural analyses revealed that increased confidence was associated with lower metacognitive accuracy.”
