The effect of corrections and corrected misinformation
Emily Thorson and Jianing Li
Misinformation poses a normative problem because it has the potential to distort both beliefs and attitudes, causing people to believe things that are not true as well as to hold attitudes that differ from those that they would have held if they had been correctly informed. Ideally, then, a successful correction will alter both beliefs and attitudes, reverting both to the same state as prior to receiving the misinformation. However, not all corrections achieve both goals. Sometimes, a correction fails to alter beliefs at all. And sometimes, it successfully changes beliefs but has no effect on attitudes.
A large body of work, including some discussed in this volume, has taken on the question of how to ensure that corrections reach the people who need them most. This chapter focuses on different questions: what happens after a person receives a correction to a piece of misinformation? When are corrections successful, and when do they fail? Can misinformation continue to affect attitudes even after it is successfully corrected? This chapter begins by outlining why some corrections are more effective than others. Then, we discuss the ways in which even misinformation that is successfully corrected can shape beliefs and attitudes.
When are corrections successful at debunking misinformation?
This section discusses factors that contribute to a correction's success at debunking misinformation. In aggregate, people do tend to move their beliefs in the expected direction when given a correction. A much-publicised 2010 study by Brendan Nyhan and Jason Reifler suggested that under certain circumstances, a correction might ‘backfire’, leading people to double down on their incorrect beliefs. However, more recent attempts to replicate and expand on those findings have demonstrated that in practice, the backfire effect is extremely rare. Wood and Porter (2019) conducted five separate experiments in which respondents were exposed to corrections of 52 different misperceptions. In the substantial majority of cases, corrections moved people closer to the truth. A similar study, conducted in the context of the 2016 US presidential election, showed that exposing people to journalistic fact-checks of false claims made by Donald Trump led them to hold more accurate beliefs. This was true for Trump supporters as well as for the sample as a whole (Nyhan et al. 2019).
However, while corrections may on aggregate move people closer to the truth, they are by no means a panacea. First, while it is not the focus of this chapter, the people most likely to need corrections are often the ones least likely to see them (Guess et al. 2020). And second, not everyone is equally likely to accept a correction. The following section details several factors that affect the likelihood of a correction being accepted. These factors include individual-level characteristics (for example, partisanship) as well as aspects of the correction (for example, whether it includes an image).
Identity and motivated reasoning
When people process a new piece of information (including corrections), they are rarely objective. Rather, their pre-existing beliefs and attitudes shape the extent to which they attend to, process, and believe the new information. This tendency is called motivated reasoning (Kunda 1990; Kraft et al. 2015). The term motivation refers to the human tendency to be driven by two different end goals. The first goal is accuracy: people generally want to hold accurate beliefs and make ‘correct’ decisions. The second goal is directional: people want to defend pre-existing identities and attitudes. The existence of motivated reasoning means that when a piece of misinformation reflects directly on someone’s identity (for example, if it concerns a controversial political issue), it will be more difficult to correct.
Political misinformation can be especially difficult to correct because it is closely tied to people’s political identities. When a piece of misinformation reinforces a person’s partisanship, they are less likely to accept a correction. For example, Ecker and Ang (2019) found that people were less likely to accept corrections of fictitious claims about misconduct by Australian politicians when the corrections were incongruent with their partisanship. The impact of political identity on the effectiveness of corrections has also been found in comparative political settings. In a study conducted shortly after major combat in the Iraq War ended, Lewandowsky et al. (2005) found that participants in countries that supported the war were less likely to accept corrections of Iraq-related misinformation than were participants in countries that were more sceptical of the war.
Beyond political identity, social categories and cultural identities also contribute to biased processing of corrections. Recent research on ‘cultural cognition’ sheds light on the importance of underlying cultural values in orienting opinion formation and change through social and cognitive processes (Kahan and Braman 2006; Kahan et al. 2007). Motivated reasoning about corrective information not only stems from partisan biases but also results from self-serving biases fuelled by any important identities, core values, or attitudes that people hold strongly. For example, among people with strong anti-vaccine attitudes, debunking false claims about the risks of flu and MMR vaccines can result in decreased intentions to vaccinate themselves or a future child (Nyhan et al. 2014; Nyhan and Reifler 2015). Similarly, when a news story highlights the Muslim American identity of a religious leader, people are less likely to accept corrections of false coverage of his claims. This tendency is heightened among those with unfavourable opinions of Islam and high social-dominance orientation (Garrett et al. 2013).