Science and the politics of misinformation

Jeremy Levy, Robin Bayes, Toby Bolsen, and James N. Druckman

The transformation of the communication environment has been a defining feature of the twenty-first century. The emergence of media choice at the end of the twentieth century (e.g. cable news, the internet) and later social media means we now live in a ‘fragmented and polarized media environment’ (Iyengar and Massey 2019). This brings a host of positive consequences, such as easier access to information and worldwide connectivity, but it also introduces new challenges, such as echo chambers in which people evade information contrary to their standing beliefs (Sunstein 2001). Perhaps the most troubling consequence is the increased likelihood of misinformation and fake news: there are no longer trusted gatekeepers who curate information for accuracy, so anyone can claim to be an expert while disseminating falsehoods.

Misinformation has become a global problem that affects all aspects of life and garners much attention in the political sphere. This is in part due to the 2016 US election, when the Russian government created fake social media avatars with names like ‘Blacktivist’ and ‘army_ofjesus’ to stoke partisan outrage, duping millions of Americans into sharing memes about the turpitude of opposing partisans (e.g. Grinberg et al. 2019). Misinformation about science, however, poses a distinct challenge. Science exists to provide systematic knowledge to improve decision-making (Dietz 2013), but the changed media environment has undermined the privileged cultural authority of science by allowing anyone to claim to be ‘scientific’.

There is an urgency to understand and address science misinformation, illuminated most recently by the COVID-19 pandemic. As we write this chapter, the social-scientific community is mobilising to advise political actors about the behavioral challenges posed by COVID-19 (Van Bavel et al. 2020). This includes the major challenge of communicating science to the public. Internationally and within the United States, government leaders differ dramatically in their embrace or disdain of scientific expertise. Early evidence indicates that false information, rumours, and conspiracy theories are proliferating among the public. At minimum, this misinformation strains experts’ abilities to communicate to the public and coordinate policy; at worst, it leads individuals to make decisions that are downright dangerous.

In this chapter, we summarise research on scientific misinformation and the challenges it poses to the implementation of government policy. Science is certainly valuable for individual decision-making in personal health and other domains. The challenge here is that science in the public sphere often becomes politicised (Oreskes and Conway 2010), with public attitudes and public policies diverging from scientific consensus on topics such as climate change, genetically modified organisms (GMOs), and vaccines (Flynn et al. 2017; Scheufele and Krause 2019). In what follows, we define the problem of science misinformation and misperceptions, discuss its causes, and review potential antidotes.

Contradicting the best available evidence: definitions and nomenclature

One challenge in studying misinformation concerns the proliferation of terms throughout the literature. We thus hope to offer some conceptual clarity. First, we distinguish between communications and beliefs. Misinformation refers to a communication that is ‘false, misleading, or [based on] unsubstantiated information’ (Nyhan and Reifler 2010: 304). This comes in various guises: rumours, defined as misinformation that ‘acquire[s its] power through widespread social transmission’ (Berinsky 2017: 242–243); fake news, defined as misinformation that ‘mimics news media content in form but not in organizational process or intent’ (Lazer et al. 2018: 1094); and conspiracy theories attributing events to ‘the machinations of powerful people, who attempt to conceal their role’ (Sunstein and Vermeule 2009: 205). Misperceptions, in contrast, are attitudes — they are ‘cases in which people’s beliefs about factual matters are not supported by clear evidence and expert opinion — a definition that includes both false and unsubstantiated beliefs about the world’ (Nyhan and Reifler 2010: 305).

What, then, does it mean to be ‘false’ or ‘not supported by clear evidence’? This is particularly tricky when it comes to science: the evidentiary standard is ambiguous because the scientific method never allows one to prove a hypothesis is correct. We build on Nyhan and Reifler (2010) and Flynn et al. (2017), who differentiate between information or perceptions that are (1) ‘demonstrably false’ — that is, contradictory to objective empirical evidence — and (2) ‘unsubstantiated’ — that is, unsupported by evidence and expert opinion. This distinction may have normative implications; Levy (2020), for example, investigates whether demonstrably false and unsubstantiated misperceptions differ in their prevalence and in the extent to which they can be corrected. When it comes to science, however, the unsubstantiated standard seems most appropriate, given the impossibility of definitive conclusions. Thus, we define scientific misinformation as a claim made in political communications that is unsupported or contradicted by the scientific community’s best available information (Druckman 2015).

Why should we care if people are misinformed about science? First, it undermines the scientific community’s ability to provide systematic knowledge to ‘help nonscientists make better decisions’ (Lupia 2013: 14048). Further, it can be worse for an individual to be misinformed and hold inaccurate beliefs than to be uninformed and hold no factual beliefs on some topic. When individuals form subjective attitudes from misperceptions, their decisions do not occur randomly but become systematically and deleteriously skewed (Kuklinski et al. 2000: 792–793). On the collective level, ‘misinformation may form the basis for political and societal decisions that run counter to a society’s best interest’ (Lewandowsky et al. 2012: 107).

 