Understanding the Psychology of Misinformation:
Why Do People Believe False Information?

Dr. Nuurrianti Jalli
Assistant Professor of Communication Studies, Northern State University

Contact Dr. Nuurrianti Jalli: [email protected]

Understanding information disorder from all perspectives is crucial if we are to tackle the problem.

One way to understand information disorder is to explore the psychological factors behind it. In this article, I will highlight a few psychological biases and principles that can help us understand why people fall victim to misinformation.

But before we go any further, it is important to understand that human beings evolved to prefer simple, easy information that requires little cognitive effort. We also prefer to be in a cognitively comfortable state, which means that any information that goes against what we believe tends to be rejected unless we are willing to put in the effort to process it.

Understanding our preference for simpler information that demands minimal cognitive labor, together with our preference for cognitive comfort, helps explain why the biases below contribute to information disorder.

Confirmation bias

Confirmation bias is the tendency to favor information that confirms our existing beliefs and hypotheses. If someone encounters information that is in line with his or her personal beliefs and fits his or her overall belief system, that individual will tend to believe it. Confirming an existing belief requires little cognitive effort compared with processing unfamiliar or new information: unfamiliar information takes cognitive labor to process, while information that contrasts with our values creates cognitive discomfort. Because human beings prefer to be cognitively comfortable, confirming existing beliefs helps us avoid extra cognitive labor and avoid cognitive dissonance; we like to feel that we are consistent in our beliefs and that our actions and our beliefs match.

Illusory truth effect

The more we are exposed to a piece of information, the more we come to believe it is true. The illusory truth effect arises in two main ways. The first is through repeated messages: repetition creates a sense of familiarity and cognitive ease, a method often used by advertisers, who expose people to the same ads again and again to build familiarity and, consequently, favorability. The second concerns presentation: people often view high-contrast media and clear information (easy-to-read fonts, high-resolution images and videos) as more credible, because these, too, create cognitive ease, which leads people to view the information more favorably.

Anchoring bias

The first piece of information we hear about a topic is often the one that sticks in our minds. If the first thing we hear about a topic is false, we might still take it on board and use it as a yardstick for evaluating other information. Again, one reason for this is that new information requires cognitive effort to process.

Tainted truth effect

The tainted truth effect can be observed in many circumstances. In journalism, it can occur when fact-checking agencies or media outlets erroneously label accurate information as misinformation. When informative news is wrongly labeled as inaccurate, the false warning reduces the news's credibility even if it is later corrected, because media audiences are often influenced by the first information they receive.

Dual-process theory

In psychology, dual-process theory explains how thought can arise in two different ways, or as a result of two different processes. Often the two processes consist of an implicit, unconscious process (System 1: an automatic process that requires little effort) and an explicit, conscious process (System 2: an analytical process that requires more effort). Because we strive for cognitive ease, we avoid thinking hard about something when doing so requires a lot of effort, so we tend to accept information (including false information) that takes little effort to process as the truth. In short, the easier something is to process, the more likely we are to think it is true.

Robert Cialdini’s principles of persuasion

In his book Influence: The Psychology of Persuasion, Dr. Robert Cialdini describes six core principles of persuasion that can be used to change people's minds, boost sales, and also make misinformation more believable. Two principles in particular make it more likely for people to adopt false information: 1) social proof and 2) authority. Social proof refers to our tendency to adopt behaviors and ideas we see others adopting. If an item or a belief becomes trendy, more people will take it up. This is especially true when we are unsure about something, and when the people holding a particular idea seem similar to us. The second principle is authority: we are more inclined to adopt ideas or beliefs promoted by someone perceived as an authority. This works in two ways. The trappings of authority can make information more believable. At the same time, celebrities and people viewed as important and authoritative in one area are also more likely to be believed, even when they promote an idea in a completely different field.

In-group bias

People are more likely to believe misinformation shared by those they perceive to be in their in-group. Human beings are social beings, and we prefer to be accepted by our peers. It is therefore understandable that we psychologically favor the group we belong to and give more credence to what its members think and feel. We also want to remain part of the group, so we avoid standing out as an anomaly.

Self-serving bias

The self-serving bias leads us to evaluate ourselves more positively. As a result, we may judge the information we already hold to be more likely true simply because it is ours. If we find an idea convincing or interesting, we may think it is more likely to be true merely because we view it, and ourselves, in a positive light.

Further readings

  • Acerbi, A. (2019). Cognitive attraction and online misinformation. Palgrave Communications, 5(1), 1-7.
  • Cialdini, R. B. (2007). Influence: The psychology of persuasion (Vol. 55, p. 339). New York: Collins.
  • First Draft (2021). The Psychology of Misinformation. Accessible at
  • Freeze, M., Baumgartner, M., Bruno, P., Gunderson, J. R., Olin, J., Ross, M. Q., & Szafran, J. (2020). Fake claims of fake news: Political misinformation, warnings, and the tainted truth effect. Political Behavior, 1-33.
  • Gampa, A., Wojcik, S. P., Motyl, M., Nosek, B. A., & Ditto, P. H. (2019). (Ideo)logical reasoning: Ideology impairs sound reasoning. Social Psychological and Personality Science, 10(8), 1075-1083.
  • Greifeneder, R., Jaffe, M., Newman, E., & Schwarz, N. (2021). The psychology of fake news: Accepting, sharing, and correcting misinformation (p. 252).
  • Kahneman, D. (2011). Thinking, fast and slow. Macmillan.
  • Weir, K. (2017, May). Why we believe alternative facts. American Psychological Association. Accessible at