The correction worked: what next?

Even when a correction is fully accepted — in other words, when a person believes the correction and understands that the misinformation is false — its effects can linger. This section focuses on how successfully corrected misinformation can shape beliefs, attitudes, and trust.

The effect of misinformation on attitudes can persist

Even when corrective information successfully updates people’s beliefs, it may not undo the effect the misinformation has had on attitudes. Thorson (2016) refers to these lingering attitudinal effects as ‘belief echoes’, presenting experimental evidence that even when a correction successfully reverts beliefs to pre-misinformation levels, attitudes remain swayed by the retracted misinformation. The ‘belief echoes’ effect has also been replicated in other contexts, where accurate belief updating failed to produce downstream effects on favourability towards Donald Trump (Nyhan et al. 2019) or on attitudes towards immigration (Hopkins et al. 2019).

Corrected misinformation also continues to shape attitudes and opinions in non-political contexts. It affects behaviour, memory, and opinion through a process sometimes called the ‘continued influence effect’ (Ecker, Lewandowsky, and Apai 2011; Ecker, Lewandowsky, Swire, et al. 2011). Across a wide range of topics and outcomes, the results are consistent: even when someone accepts a correction, it does not always fully ‘un-ring the bell’ of misinformation.

There are several mechanisms behind this continued influence effect. First, people sometimes engage in attitude-congruent motivated reasoning. Even when their misperceptions are corrected, they ‘explain away’ the new information by looking for other arguments that reconcile the uncongenial information with their preferred worldview. One such strategy is biased attribution of blame: even when partisans accept facts about changes in economic conditions, they rationalise the information by blaming the opposing party for any worsening conditions and praising their own party for any improvements (Bisgaard 2019). People may also explain away uncongenial information with biased credibility judgments, expressing their displeasure by concluding that its source is not credible after all (Khanna and Sood 2018).

Second, sometimes the misinformation becomes part of a person’s ‘mental model’ of a particular event and thus becomes more difficult to dislodge (Ecker et al. 2015). For example, when misinformation about a fictitious warehouse fire (that it was caused by negligence with volatile materials) was retracted, the retraction reduced reliance on the misinformation only when it offered an alternative explanation (that evidence of arson had been found elsewhere) (Johnson and Seifert 1994). Because alternative explanations help people revise their mental models, they can be more effective than simple retractions at reducing the continued influence effect (Walter and Murphy 2018).

Finally, a piece of misinformation can carry an ‘affective charge’ that shapes a person’s emotional reactions to the object of the misinformation in a way that even a successful correction cannot fully eliminate (Sherman and Kim 2002). Lodge and Taber (2005) call this ‘hot cognition’; they found that the affective charge attached to a socio-political concept can be activated within milliseconds of exposure, much faster than any cognitive evaluation of the concept. They also found that people had a more difficult time processing affectively incongruent pairings (e.g. cockroach and delightful) than affectively congruent ones (e.g. cockroach and disgusting), which implies that corrections running counter to one’s automatic affective responses may be less effective.

Unintended consequences of the media’s focus on corrections

It is also worth noting that the intense media focus on fact-checking and misinformation may have unintended consequences of its own. When people are repeatedly exposed to corrected misinformation, they may infer a larger lesson: that the information environment is a dangerous place that is difficult to navigate (Wenzel 2019). Some empirical evidence gives credence to this concern. For example, a general warning about misinformation on social media, despite reducing people’s belief in false news headlines, also reduced their belief in true headlines (Clayton et al. 2019). The same spillover appears in similar interventions, such as providing tips on detecting misinformation (Guess et al. 2019b), although in both cases the reduction in belief in true headlines is substantially smaller than the reduction in belief in false ones. Further, reading corrections in which journalists adjudicate factual disputes can reduce confidence in one’s own ability to find truth in politics among those less interested in the topic under dispute, raising normative concerns about the unintended effects on political efficacy (Pingree et al. 2014).
