
The Life and Times of the 10% Neuromyth

How much of your brain are you using right now?

The science fiction answer would be “10%.” And, if you’ve watched enough bad sci-fi, you know that using 100% of your brain would unlock your full intellectual and cognitive abilities. Einstein learned how to use 100% of his brain, right?


Wrong. So wrong, and so infamous, is the “10% brain myth” that it has its own Wikipedia page.

Yet, the myth just won’t die. A survey suggests that 65% of people are still stuck on the idea that they only use 10% of their brains. Why?

I was reminded of the 10% myth while watching the science-fiction television series Black Mirror. Created by English satirist Charlie Brooker, the show explores technological horrors through standalone episodes that are no stranger to neuro-centric plotlines. Past episodes have explored the nuances of what it would be like to have mind-reading technology, or even implants to record (and play back) our every memory.

The most recent episode, Black Museum, follows a washed-up doctor who creates and sells neurotechnology that pushes his patients down spirals of pain addiction and embodied consciousness. These are fine speculative science plotlines until the doctor cites a version of the 10% brain myth to explain how his neurotechnology works: “Even on a good day, we only use 40% of our brain capacity.” Of course, says the show, that leaves 60% of “empty” brain capacity.

This comes from the same show that has cleverly used accurate brain facts to offer thoughtful musings on the interactions between science and human technology. For an otherwise scientifically considerate show, the crutch of the 10% myth seemed so lazy, so jarring, so egregious, that I wondered: what about the 10% myth is so captivating? And how does the myth keep worming its way into science fiction?

Let’s explore the life and overdue times of the 10% brain myth.

 

Who started the 10% myth?

Dr. Barry Beyerstein, writing for Scientific American, has gone down the rabbit hole of the 10% myth. He reported that there is no single origin of the myth, though some have pointed to journalist Lowell Thomas as a possibility. In his 1937 introduction to Dale Carnegie’s How to Win Friends and Influence People, Thomas incorrectly cited 19th-century philosopher William James as having said that most humans develop only 10% of their “latent mental ability.” Cue the fanaticism.


A broader answer may lie in the thresholded images (those with bright, colorful hotspots) published in papers that use human neuroimaging methods, such as PET and fMRI. These images, while crucial for research, have been the unfortunate bane of many a neuromyth. The earliest functional brain scans were exciting to the public, who could now look at the activity of the living human brain. Bright blobs highlighted distinct regions of the brain as metabolically active, prompting headlines like “Your brain lights up during this task!”

Sensationalized statements like this, which make vague claims about the brain’s energy consumption, have likely helped shape the faulty concept that only small percentages of the brain are active at any given time. What a brain scan and a bad headline don’t explain, however, is that the bright blobs are active relative to other areas of the brain, and that the comparison is task-specific. The rest of the brain is active too, just not as much as the regions that “light up.” Thus, the idea that we use only “10% of the brain” might stem from scientific findings on specialized regions of the brain that are known to be particularly active during specific tasks.
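To make the thresholding point concrete, here is a minimal sketch in Python using synthetic numbers. This is not a real fMRI pipeline; the voxel count, effect size, and the 2.3 threshold are illustrative assumptions. The point is simply that every “voxel” carries nonzero activity, yet only a small fraction survives the statistical cutoff that produces the bright blobs.

```python
import numpy as np

# A minimal sketch of statistical thresholding with synthetic numbers --
# not a real fMRI pipeline. Every voxel has nonzero activity, yet only a
# small fraction clears the threshold that produces the bright blobs.
rng = np.random.default_rng(0)

n_voxels = 10_000
baseline = rng.normal(loc=100.0, scale=5.0, size=n_voxels)  # all voxels metabolically active

# The task adds an extra response in a specialized 5% of voxels.
effect = np.zeros(n_voxels)
task_region = rng.choice(n_voxels, size=n_voxels // 20, replace=False)
effect[task_region] = 15.0

task = baseline + effect + rng.normal(scale=5.0, size=n_voxels)  # measurement noise

# Voxel-wise contrast (task minus baseline), standardized to z-scores.
contrast = task - baseline
z = (contrast - contrast.mean()) / contrast.std()

threshold = 2.3  # a common cluster-forming threshold in fMRI papers
print(f"Voxels that 'light up' above threshold: {(z > threshold).mean():.1%}")
print(f"Voxels with nonzero baseline activity:  {(baseline > 0).mean():.1%}")
```

Raising or lowering the threshold changes how much of the brain appears to “light up,” which is exactly why a thresholded image says little about total brain usage.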

 

Where else have we seen the 10% myth?

Dr. Beyerstein also observed that the myth first made its rounds in popular culture as early as the 1980s, in films like Flight of the Navigator, which features a protagonist who is able to memorize new information about the galaxy because scientists unlock 90% of his brain. The myth re-emerged in the 2000s, seemingly as sci-fi films stepped away from space and into the human body. First, the hit TV show Heroes invoked the unlocked potential of the brain to explain its characters’ superhuman abilities. Next, the 2011 film Limitless (which had its own brief spin-off TV series) starred Bradley Cooper as a writer whose mental and physical abilities soar after being heightened by an Adderall-like drug. This drug, the movie suggests, enables him to use more than 20% of his brain. Finally, the 2014 Luc Besson film Lucy featured Scarlett Johansson as a woman who gains psychokinetic abilities after a drug unlocks the supposedly dormant 90% of her cortex.

Subtler interpretations of the myth have appeared even in comedies, like Seinfeld (1996) and The Simpsons (1998). Movies like Inception (2010) have also toyed with the idea of unlocked brain potential in order to tell a story. In short, there is no scarcity of fictional characters using 100% of their brains.

But the plot is always the same: each story features a character whose cognitive abilities are heightened, almost inhumanly so, by engaging more of the brain. This rarely comes without a cost, though. More on that later.

 

What would happen if you used 100% of your brain?

This question is difficult to consider, because the concept that brain usage can be measured in exact percentages is so far removed from biology. Nonetheless, here are some biologically rooted interpretations.

One interpretation? A seizure.

Seizures are defined by excessive, synchronous neural activity. To use 100% of our brains (that is, to stimulate each of the brain’s roughly 86 billion neurons to fire at maximum capacity) would likely be fatal. Hoping for synchronous excitatory activity across the entire cortex is, in effect, hoping for a generalized tonic-clonic (grand mal) seizure. This is the most severe type of seizure, and it leads to loss of consciousness and severe muscle contractions, not the unlocking of superhuman abilities.
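To see what “excessive and synchronous” means in practice, here is a toy sketch in Python. It is not a biophysical model; the neuron counts, firing rates, and the synchrony index are simplified stand-ins for measures used in computational neuroscience.

```python
import numpy as np

# A toy contrast -- not a biophysical model -- between normal, sparse
# cortical firing and seizure-like activity in which the whole
# population discharges together.
rng = np.random.default_rng(1)

n_neurons, n_bins = 1_000, 500  # 1,000 neurons over 500 one-millisecond bins

# Normal state: each neuron spikes independently and sparsely (~5 Hz).
normal = rng.random((n_neurons, n_bins)) < 0.005

# Seizure-like state: every neuron follows the same massed burst pattern.
burst_bins = rng.random(n_bins) < 0.3
seizure = np.tile(burst_bins, (n_neurons, 1))

def synchrony_index(spikes):
    """Variance of the population rate relative to mean single-neuron variance.

    Near 0 for asynchronous firing; near 1 when neurons fire in lockstep.
    """
    population_rate = spikes.mean(axis=0)
    return population_rate.var() / spikes.var(axis=1).mean()

print(f"Normal firing: synchrony = {synchrony_index(normal):.3f}")
print(f"Seizure-like:  synchrony = {synchrony_index(seizure):.3f}")
```

The seizure-like state maximizes both activity and synchrony, which is the closest biological reading of “100% brain usage,” and nothing about it resembles enhanced cognition.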


But what if our brains are already always active? This leads us to another way of interpreting brain activity, one that hearkens back to neuroimaging.

Newer imaging methods suggest that many parts of the brain are, paradoxically, active at baseline, or at rest. These regions, which have been found to be more active and more synchronized during rest than during tasks, are collectively termed the default mode network (DMN). The DMN consists of regions centered on the ventromedial prefrontal cortex and the posterior cingulate cortex, which are thought to exhibit higher activity specifically at rest or during “non-task” states like mind-wandering and casual rumination. Evidence in support of the DMN doesn’t mean that the brain is 100% active, but it does suggest that large portions of the brain are never truly dormant, as the 10% myth would otherwise suggest.
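For intuition, here is a minimal sketch of how resting-state synchrony between two DMN regions is typically quantified, with synthetic time series standing in for real fMRI data (the numbers are purely illustrative):

```python
import numpy as np

# A toy illustration of how "synchronized at rest" is quantified: the
# correlation between the activity time series of two regions. Synthetic
# signals stand in for real BOLD data; vmPFC and PCC are the DMN hubs
# named above.
rng = np.random.default_rng(42)

n_timepoints = 200
network_signal = rng.normal(size=n_timepoints)  # slow fluctuation shared across the DMN

# Each region = shared network signal + region-specific noise.
# Neither region is ever "off" -- both fluctuate continuously.
vmpfc = network_signal + 0.5 * rng.normal(size=n_timepoints)
pcc = network_signal + 0.5 * rng.normal(size=n_timepoints)

# Pearson correlation: values near 1 mean the regions rise and fall together.
connectivity = np.corrcoef(vmpfc, pcc)[0, 1]
print(f"Resting-state vmPFC-PCC correlation: {connectivity:.2f}")
```

Real analyses work the same way in spirit: what they detect is correlated, ongoing fluctuation between regions at rest, not regions switching between 0% and 100%.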

Studies have suggested that atypical DMN activity might be linked to clinical disorders, such as ADHD and depression. Activity within the DMN has also been hypothesized to support higher cognitive functions, like those portrayed in Limitless and Black Mirror. Perhaps our brains are already optimized.

 

Why do we perpetuate this myth?

Metaphor can help distill difficult ideas. But sometimes, metaphors obstruct the truth behind an idea, and science is no stranger to the folly of metaphor.

In some ways, the myth that we use 10% of our brain is rooted in the metaphor that CPUs are the brains of computers and, conversely, that our brains are the CPUs of the human body.


Such metaphors also allow us to imagine that the same graphs we see on our computer screens (the ones that measure storage and memory) apply to the human brain. We are comforted to see unused potential on our devices, to see that we have 90% of our storage remaining, so it is easy to see why the same concept might be consoling when we consider our brain’s capacity. We all wish we had more space, more storage, to devote to cognitive tasks.

This leads us to another reason media perpetuates this neuromyth: good old wishful thinking.

We like to think that we could be better than we are. We like to think of ourselves as wielding some untapped power. Human potential is a lucrative business in our society beyond science fiction, and fulfilling human potential was once even the goal of cultural movements like the Human Potential Movement of the 1960s. Today, “brain boosters” are advertised like candy on television and in magazines. As neuroscientist David Eagleman has said, the 10% neuromyth is “the neural equivalent to Peter Parker becoming Spiderman.” Perpetuating the myth that our brains are never fully utilized is an iteration of a metaphor that serves our wishful thinking but, in doing so, pushes us away from a fuller understanding of our neurobiology.

 

What lies ahead for the neuromyth?

Interestingly, reviews of the Black Mirror episode in question are overwhelmingly negative relative to the show’s usual applause. Moreover, the reviews imply that audiences are weary of the question: what happens if we use more than 10% of our brains? Thanks to overuse of the 10% myth, audiences don’t need to work hard to conclude that any character who unlocks 100% of their brain will meet a most unhappy ending. The implications of the 10% myth might no longer be thought-provoking, because we are learning just how little biological plausibility the question itself holds.

There are more interesting scientific queries to explore. Let’s hope science fiction finds them soon.

 

What are your least favorite neuromyths? Respond in the comments below!


Illustration by Kayleen Schreiber. 

Author

  • Gabrielle Torre

    Gabrielle-Ann is a PhD student at Georgetown University and studies the neural correlates of reading, IQ, and socioeconomic status. She is broadly interested in using neuroimaging methods to ask questions about human cognitive behaviors and abilities. Previously, she studied brain-behavior relationships in healthy aging at the University of Arizona, where she developed a love for literature and creative writing. She still enjoys reading and writing, as well as live music, gender studies, and eating.


4 thoughts on “The Life and Times of the 10% Neuromyth”

  • It’s not so much about CPU. It’s about RAM. And there is possibly no quantifiable upper limit on storage capacity of the brain (reference Reber’s exponential storage theory). So, we use much of our CPU much of the time, but the hard drive space is perhaps limitless. Does that mean we use less than 10% of our brain? In layman’s terms, yes – we use far less than 10% of our storage potential by just about anyone’s measurements.

  • How does using 100% of our brain result in a seizure anyway? Does that even make sense?
    I guess it’s like a CPU overheating or something. But isn’t our brain designed to be used at full capacity all the time???

  • How would using your brain at full capacity result in a seizure anyway? Is it like your CPU overheating? Or is it like not having enough RAM, so the application crashes? But still, wasn’t the human brain designed to operate at full capacity??
