The neuroscience of false beliefs
Lead authors: Andreas Olsson, Department of Clinical Neurosciences, Karolinska Institutet, and Björn Lindström, Department of Clinical Neurosciences, Karolinska Institutet
What we believe is not only a result of our own reasoning, but also of the beliefs of people around us. Our brains are wired to consume information that is liked by our peers. In this way, social reinforcers online – in the form of likes, comments and shares – can build a basis for what individuals believe to be true or false.
Why do people adopt beliefs, such as that the earth is flat, which are clearly incompatible with established truth? As with other beliefs, it depends on the nature of their prior beliefs and goals, as well as on learning and reasoning processes. Our knowledge about the psychological and neural mechanisms that promote false beliefs is still nascent, but existing research in psychology and neuroscience has described how people process information that is unexpected, emotional, and politically divisive – features typical of false information encountered on social media.
First, information that is unexpected and emotionally loaded (especially negatively) tends to capture our attention, a process that depends on visual cortices and attentional networks in parietal and frontal brain regions. Attention also depends on activity in deep brain nuclei that are sensitive to events that may be threatening or relevant to the individual's important goals.
Different goals of the individual can take precedence. For example, if social goals, such as promoting one's own social status or belonging to a particular group, are more important than the goal to seek and represent the truth, then attention is biased towards whatever promotes the more important goals. Research has shown that the orbitofrontal cortex, a brain region located just behind the eyes, is critical for computing the value of different goals. Other prefrontal regions serve to monitor information that deviates from one's own moral or political convictions.
People who strongly identify with different political ideologies tend to distort their conclusions in line with their political identities when asked to reason about politically divisive facts. At first glance, one could think that this is due to politically motivated reasoning. However, it can also be viewed as rational inference given the individual's prior belief system (Botvinik-Nezer et al., 2023). Networks of brain regions linked to memory retrieval and belief updating are likely to support these processes.
Recent research shows that simple learning mechanisms, known to engage regions in the brain's dopamine system, can explain how humans become motivated to increase their consumption of information that is liked by others – in particular by members of the group the individual identifies with. For example, models of instrumental learning explain why people engage with social media (Lindström et al., 2021; Ceylan et al., 2023) and how messages expressing moral outrage become viral (Brady et al., 2021). The social reinforcers at play in social media might thus provide a basis for what individuals believe to be true. Indeed, research has shown that social forms of learning can be at least as powerful as, and draw on many of the same neural principles as, learning based on one's own direct experiences (Olsson et al., 2020). The importance of understanding the power of social influences is further underscored by recent findings showing that the more a statement is "liked" by others, the more that statement is subsequently rated as true (Granwald et al., 2023).
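The instrumental-learning account referenced above can be illustrated with a minimal simulation. The sketch below uses a standard prediction-error (Rescorla-Wagner-style) update, the basic mechanism such models build on; the parameter values, reward distribution, and function name are illustrative assumptions, not taken from the cited studies.

```python
import random

def simulate_social_reward_learning(n_trials=1000, alpha=0.1, seed=0):
    """Toy model: the learned value of posting is pulled toward the
    social reward ("likes") received, via a prediction-error update.

    alpha is the learning rate; all numbers here are illustrative.
    """
    rng = random.Random(seed)
    value = 0.0          # current value estimate for posting
    history = []
    for _ in range(n_trials):
        likes = rng.gauss(5.0, 1.0)          # stochastic social reward
        value += alpha * (likes - value)     # prediction-error update
        history.append(value)
    return value, history

value, history = simulate_social_reward_learning()
```

Run over many trials, the value estimate converges toward the average social reward, capturing in miniature how consistent social feedback can reinforce posting behaviour.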
In sum, although much is known about the psychological and brain bases of acquiring and updating beliefs in general, much less is known about how we process false beliefs encountered online. Because the lack of shared knowledge about what is true can fuel polarization, conflict and other destructive behaviours (Brady et al., 2023), we need to increase efforts to understand the basic psychology and neuroscience of false beliefs.
Botvinik-Nezer, R., Jones, M. & Wager, T.D. (2023). A belief systems analysis of fraud beliefs following the 2020 US election. Nat Hum Behav.
Brady, W. J., McLoughlin, K. L., Torres, M. P., Luo, K. F., Gendron. M. & Crockett, M. J. (2023) Overperception of moral outrage in online social networks inflates beliefs about intergroup hostility. Nat. Hum. Behav.
Ceylan, G., Anderson, I. A. & Wood, W. (2023). Sharing of misinformation is habitual, not just lazy or biased. Proc. Natl. Acad. Sci. U S A 120(4), e2216614120
Granwald, T., Pärnamets, P., Czarnek, G., Szwed, P., Kossowska, M., Lindholm, T., & Olsson, A. (2023). Conformity and partisanship impact perceived truthfulness, liking and sharing of tweets. Abstract presented at the Society for Personality and Social Psychology 2023 Annual Convention, GA, US, Feb 24.
Kepios (2023). Data report. Report. Online: https://datareportal.com/reports/digital-2022-global-overview-report retrieved May 11, 2023.
Lindström, B., Bellander, M., Schultner, D. T., Chang, A., Tobler, P. N. & Amodio, D. M. (2021). A computational reward learning account of social media engagement. Nat. Commun. 12, 1311.
Osmundsen, M., Bor, A., Vahlstrup, P. B., Bechmann, A. & Petersen, M. B. (2021). Partisan polarization is the primary psychological motivation behind political fake news sharing on Twitter. Am. Polit. Sci. Rev.
Olsson, A., Knapska, E. & Lindström, B. (2020). The neural and computational systems of social learning. Nat. Rev. Neurosci. 21, 197–212.
Rathje, S., Van Bavel, J. J. & van der Linden, S. (2021). Out-group animosity drives engagement on social media. Proc. Natl. Acad. Sci. U S A 118(26), e2024292118.
Vosoughi, S., Roy, D. & Aral, S. (2018). The spread of true and false news online. Science 359(6380), 1146-1151.
About the authors
Andreas Olsson is a professor at the Psychology Division of the Department of Clinical Neurosciences at Karolinska Institutet. He leads the Emotion Lab at the institute, aiming to understand how emotional information is transmitted, learned and regulated in social situations.
Björn Lindström is a senior research specialist at the Psychology Division of the Department of Clinical Neurosciences at Karolinska Institutet. He is also a member of the Emotion Lab.