The Seductive Allure of Neuroscience: Can Info About Your Brain Change Your Mind?

By Paige Nicklas

What Is the SANE Effect?

Imagine you’re at home watching your favorite evening show after a long day at work, when a commercial for a new vitamin comes on. The ad claims their vitamin will help you focus and be more productive in all areas of life! They show images of brain scans of people who have taken the new vitamin, highlighting the areas the vitamin enhances so it can “work its magic.” 

Would you be persuaded to purchase and try the new vitamin? Now consider the same commercial, but instead of providing evidence of their vitamin’s effectiveness with brain scans, they show changes in scores on a cognitive test. Could that also persuade you? What if they don’t provide any scientific explanation for how the vitamin works? The “SANE” effect, or the Seductive Allure of Neuroscience Explanations, describes why this kind of persuasion can work. The phenomenon suggests that people are more readily swayed to believe a claim when it is accompanied by a neuroscience-based explanation, whether conveyed in words or images, and that this “neuro-info” can allegedly influence decision making or change the minds of the people consuming it (Weisberg et al., 2008). However, studies over the last decade have attempted to quantify the details of this effect, with mixed results. Some have even failed to demonstrate the SANE effect altogether.

Recently, a meta-analysis of studies on the SANE effect was published. A meta-analysis is a method for examining data by collecting multiple studies that ask a similar research question and combining their results to look at overall trends. This one synthesized sixty such experiments and aimed to understand exactly how the SANE effect works, and whether it truly works at all.
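
For intuition about what “combining their results” can look like, here is a minimal sketch, in Python, of an inverse-variance-weighted pooled effect size, one of the basic building blocks of meta-analysis. The numbers are invented for illustration and are not values from the study discussed below; the authors’ actual analysis is more sophisticated than this.

```python
# Illustrative sketch only: pooling standardized effect sizes with
# inverse-variance weights. The values below are made up and are NOT
# the effect sizes from the meta-analysis described in this article.
import numpy as np

effect_sizes = np.array([0.40, 0.10, 0.25, -0.05, 0.30])  # hypothetical standardized effects
variances = np.array([0.04, 0.02, 0.05, 0.03, 0.06])      # hypothetical sampling variances

weights = 1.0 / variances                                  # more precise studies count more
pooled = np.sum(weights * effect_sizes) / np.sum(weights)  # weighted average effect
pooled_se = np.sqrt(1.0 / np.sum(weights))                 # standard error of the pooled estimate

ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled effect: {pooled:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```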

What Did They Find?

The authors, Bennett and McLaughlin (2024), wanted to measure the strength of the SANE effect. To do so, they pooled the effect sizes from all of the experiments’ findings into a single overall estimate. They concluded that the overall effect was significant, but small. This confirms that the effect exists, but raises questions about its specifics. Are there multiple types? Can it only be elicited under certain conditions?

To begin answering these questions, they suspected that the wide range of results from previous studies may be due to differences in how the studies were designed. Firstly, some studies were within-subjects, meaning each participant was exposed to both the neuroscience and the non-neuroscience explanations, and an individual’s perception of one type was compared to their own perception of the other. Other studies were between-subjects, where participants were split into separate groups: one group was randomly assigned to see only the neuroscience explanations, the other saw only the non-neuroscience explanations, and the two groups’ responses were then compared to each other. The first type of design, within-subjects, produced a stronger effect. The authors propose that the reason may lie in the ability of individuals in a within-subjects study to directly compare a neuroscience explanation to one without, whereas in a between-subjects study, individuals are exposed to only one type of explanation and cannot compare it to anything else.
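
To make that design difference concrete, here is a small simulated comparison in Python. The ratings are hypothetical numbers generated for this sketch, not data from any of the reviewed studies; it simply shows why letting each person serve as their own control can make the same small difference easier to detect.

```python
# Toy simulation of within- vs between-subjects designs using invented ratings.
# Not data from the meta-analysis; just an illustration of statistical sensitivity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 40
bump = 0.3  # small hypothetical boost to ratings of neuroscience explanations

# Within-subjects: the same people rate both kinds of explanation
baseline = rng.normal(5.0, 1.5, n)            # each person's general rating tendency
non_neuro = baseline + rng.normal(0, 0.5, n)  # rating of the plain explanation
neuro = baseline + bump + rng.normal(0, 0.5, n)
print("Within-subjects (paired):", stats.ttest_rel(neuro, non_neuro))

# Between-subjects: two separate groups, so person-to-person variability adds noise
group_neuro = rng.normal(5.0 + bump, 1.5, n)  # sees only neuroscience explanations
group_plain = rng.normal(5.0, 1.5, n)         # sees only non-neuroscience explanations
print("Between-subjects (independent):", stats.ttest_ind(group_neuro, group_plain))
```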

Furthermore, the meta-analysis investigated whether neuroscience explanations using text or brain imagery had a larger impact on perception and satisfaction, and found that text had stronger effects. The authors expected these results because previous studies had failed to replicate the original study that found a strong effect of brain imagery (McCabe & Castel, 2008). One of those studies speculated that the reason text-based effects are stronger may lie in the quality of the explanation the neuroscience information is being added to. They suggested that neuroscience text (e.g. “Brain scans indicate that…”) added to an already vague or circular explanation of some topic will have a stronger SANE impact than a brain image (e.g. an image of an MRI scan), because the text artificially adds context but the image does not. To fully explore this, more research should look at how varying levels of base explanation quality interact with the addition of irrelevant neuroscience information through text and images (Michael et al., 2013).

The authors additionally explored how the SANE effect was influenced by the type of questions a study asked its participants. They found that the SANE effect was stronger in participants’ ratings of quality, satisfaction, or metacomprehension (how well a participant thought they understood the explanation). However, there was only a very weak effect when comparing participants’ agreement or disagreement with the explanations provided. This suggests that neuroscience information does little to change how convincing or believable an explanation seems. Even further, while studies found that neuroscience information increased feelings of understanding and satisfaction, objective measures of understanding did not improve. Participants said they understood better, but there was no actual change in their comprehension, nor did this increased feeling of understanding convince people to agree or disagree with the information presented to them.

Taken together, it seems that the SANE effect is complex and heavily depends on the techniques used and questions asked. Overall, neuroscience information does affect people’s perception of information, but we are just beginning to understand how that influence works.

For a great summary of the study, view the YouTube video created by the authors!

Why Are These Types of Studies & Findings Important? 

Meta-analyses, like the one described above, are important for scientists to conduct and for non-scientists to learn about. When asked about the importance of this type of work, one of the authors of this study, Dr. Peter McLaughlin, explained: “You have to know that one study being done shouldn’t really change anything by itself.” He emphasizes that single scientific studies cannot stand alone, and shouldn’t be presented as though they can: “You want a large number of studies…because there’s always a larger picture.”

Understanding how a lay audience, or people who are non-experts on a particular subject, perceives information about that subject is vital for how scientists decide to communicate their research. McLaughlin stresses the importance of maintaining the integrity of the research and avoiding over-simplification. He references examples of how neurotransmitters like serotonin and dopamine have been dramatically simplified in popular media, and how this can cause misunderstanding. “It is okay to say, ‘This is a complicated problem,’” said McLaughlin. Lead author Elizabeth Bennett adds that “communicating science in a way that’s informative, but doesn’t overstep the research is important.”

Both authors explain that learning more about how lay people perceive science will help strengthen the relationship between science and the public. If that relationship can be built through excellent science communication, it benefits everyone. It has been found that increased trust in science makes people better able to evaluate the quality of scientific information, and this makes them less likely to be influenced by low-quality evidence, or even intentional science misinformation (Rosman & Grösser, 2023).

One thing that should be avoided when discussing the implications of work like this, though, is assuming that public distrust in science is merely due to a lack of sufficient knowledge, and that false beliefs can be corrected by simply providing facts. This is known as the “Information Deficit Model” of science communication, and it has long been critiqued because it fails to take into consideration the external factors that play a role in how an individual’s opinion and perception of science is formed (Ecker et al., 2022). These factors can include things like personal experiences with science; political, cultural, and religious beliefs; and emotions toward science (Ecker et al., 2022; Simis et al., 2016). Studies such as Bennett and McLaughlin’s meta-analysis can begin to combat the deficit model, with the ultimate goal of establishing science communication strategies that are rooted in evidence from the social sciences (Simis et al., 2016) and in knowledge of how this complex relationship forms. More research on this will be important so scientists can adapt how they design studies, interpret their results, and share those results with the public.

This meta-analysis reveals that neuroscience information increases satisfaction with an explanation, but does not necessarily convince people to find a fact or topic more believable, nor does it increase their objective understanding. Some techniques appear to be more impactful than others, like using text instead of imagery. More research is needed, though, to truly understand the nuances of the SANE effect. Still, neuroscience explanations continue to be used in media, marketing, and even the law. So, maybe take a moment of extra consideration when you see a powdered drink mix that claims to promote better sleep, an image of what your brain looks like when you view your favorite political candidate, or a brain-training game or app, and remember that researchers are still trying to uncover exactly how neuroscience information persuades us.

~~~

Written by Paige Nicklas
Illustrated by Kayla Lim
Edited by Zoë Dobler, Caitlin Monserrat, and Shiri Spitz Siddiqi

~~~

References

Bennett, E. M., & McLaughlin, P. J. (2024). Neuroscience explanations really do satisfy: A systematic review and meta-analysis of the seductive allure of neuroscience. Public Understanding of Science, 33(3), 290–307. https://doi.org/10.1177/09636625231205005

Ecker, U. K. H., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., Kendeou, P., Vraga, E. K., & Amazeen, M. A. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1, 13–29. https://doi.org/10.1038/s44159-021-00006-y

McCabe, D. P., & Castel, A. D. (2008). Seeing is believing: The effect of brain images on judgments of scientific reasoning. Cognition, 107(1), 343–352.

Michael, R. B., Newman, E. J., Vuorre, M., Cumming, G., & Garry, M. (2013). On the (non)persuasive power of a brain image. Psychonomic Bulletin & Review, 20(4), 720–725.

Rosman, T., & Grösser, S. (2023). Belief updating when confronted with scientific evidence: Examining the role of trust in science. Public Understanding of Science. Advance online publication. https://doi.org/10.1177/09636625231203538

Simis, M. J., Madden, H., Cacciatore, M. A., & Yeo, S. K. (2016). The lure of rationality: Why does the deficit model persist in science communication? Public Understanding of Science, 25(4), 400–414. https://doi.org/10.1177/0963662516629749

Weisberg, D. S., Keil, F. C., Goodstein, J., Rawson, E., & Gray, J. R. (2008). The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience, 20(3), 470–477. https://doi.org/10.1162/jocn.2008.20040

Author

  • Paige Nicklas

    Paige is a PhD student at the University of Rochester School of Medicine and Dentistry, studying neuroscience. She has a BS in Psychology and an MS in Neuroscience, and is interested in researching the blend of these two disciplines. Her current research investigates cognitive-motor interactions in typically and neurodivergently developing children and young adults, exploring the impact of movement on cognitive performance in early life. Outside of the lab, she is passionate about expanding science education and communication for all, and encouraging public engagement with science. In her free time, she enjoys reading, drawing, and caring for her many houseplants.
