Misogyny as core logic of misinformation campaigns

Extreme-right movements use misogyny as a core logic of their politics and their misinformation campaigns; it is not merely a strategy or tactic. Rather, these movements are frequently based on misogyny as a set of discourses and practices that aim to ‘reset’ the gender balance back to its ‘natural’ patriarchal relation (Banet-Weiser 2018). Within these movements, the need for feminist politics (much like the need to dismantle systemic racism) is positioned as misinformation. While the racist ideologies of the extreme right have often been correctly identified as white nationalism, the extreme right has always also run on an overtly misogynistic agenda; as Matthew Lyons points out, ‘Harassing and defaming women isn’t just a tactic; it also serves the alt-right’s broader agenda and long-term vision for society’ (2016, para. 8, emphasis added). A key strategy of the extreme right is recuperation: men’s rights organisations in digital culture are filled with false campaigns about how women and feminists have not just destroyed society but emasculated it.

The gendered logic of misogynistic misinformation campaigns is that of a zero-sum game: men lose, become invisible, when women win and become more visible. Conservative populist movements take a particular shape in a contemporary crisis in hegemonic masculinity, a crisis brought on by global economic collapse (as men lose their jobs and future security), by more visible efforts to diversify workplaces and cultural spaces (exemplified by a few visible successful women in technology fields), and by increasing popular feminist activism. Within this crisis, some men (particularly white working- and middle-class men) see themselves as losing cultural and political ground, relinquishing patriarchal authority (Rosin 2013; Negra and Tasker 2014; Banet-Weiser 2018). Within the context of this existing sense of injury and loss, feminists’ call for more equity is framed as dangerous misinformation. Women, and specifically feminism, are cast as the reason for this loss and become targets for misinformation campaigns. Consequently, a normalised misogyny is often the price women pay for being visible online, with digital platforms such as Twitter and Facebook doing little to monitor misogynistic misinformation campaigns.

These misinformation campaigns have been directed particularly intensely at black women, as part of what black feminist scholar Moya Bailey has called ‘misogynoir’, the specific targeting of black women for misogynistic and racist abuse (Bailey 2020). One of the earlier examples of how misinformation was used against black women online came in 2013, when a series of misinformation campaigns were circulated on Twitter (Broderick 2014; Diop 2019; Hampton 2019; Donovan 2019). These Twitter campaigns were initially launched through false hashtags that pretended to come from black women: specifically #EndFathersDay. An elaborate hoax, #EndFathersDay was started by anonymous trolls on 4chan to simulate feminist outrage at the idea of having a national holiday for fathers, claiming that Father’s Day was a symbol of patriarchal oppression. As Donovan points out, ‘To grab attention, these trolls relied on the social norms of trusted self-identification (“I am a black feminist”) alongside feminist support strategies (“listen to women of color”)’ (Donovan 2019). Not surprisingly, conservative media pundits fell for the hoax, amplifying their critique of feminists, especially black feminists (Hampton 2019). This misinformation campaign was exposed by actual black feminists, particularly Twitter users Shafiqah Hudson and I’Nasah Crockett, who signalled the fabricated tweets with the hashtag #yourslipisshowing as a way to make others aware that these tweets were intended to pit black women against each other. However, this kind of mimicry can never be complete. The notion that the ‘slip is showing’ is explicitly about how misinformation will never quite bamboozle those who are in possession of the ‘real’ information (who know how to hide the proverbial slip). Consequently, it’s also a way of calling out the fact that this misinformation campaign does not imagine black women as the recipients at all, but rather white men and women.

#EndFathersDay and other faux black feminist accounts did not receive the kind of international attention that other misinformation campaigns did. As reporter Aremita Diop points out,

Even before the Russian Internet Research Agency weaponized these tactics for the 2016 election, anonymous 4chan users spread #EndFathersDay through false-flag Twitter accounts, posing as black women to exacerbate fissures between feminists of color and white feminists as well as rile up conservative pundits. But few outside of the online community of black women realized at the time that this was a coordinated operation.

(Diop 2019)

As Ryan Broderick reports, #EndFathersDay was part of a larger Men’s Rights Activist effort called ‘Operation Lollipop’, in which ‘the idea is to pose as women of color on Twitter and guide activist hashtags as a way to embarrass the online social justice community’ (Broderick 2014). Other early online fake outrage campaigns, such as #WhitesCantBeRaped, also emerged from 4chan (specifically, the site’s politically incorrect message board, /pol/), in an effort to outrage feminists and make a mockery of feminist online campaigns.

But Twitter campaigns of misinformation capitalised on well-established structures of racism and sexism; well before the current preoccupation with the crisis of misinformation in the digital sphere, women and people of colour had been the targets of what could be called misinformation campaigns. In other words, using fake accounts to encourage feminist activists to turn against each other was an early iteration of the relationship between misogyny and misinformation, yet like many other subsequent misinformation campaigns that target women, these faux black feminist accounts did not attract the same kind of attention that others in the ‘post-truth’ era have received. Despite the important feminist activism that emerged in response to the misinformation campaign, exemplified by #yourslipisshowing, the tactics used by 4chan and other extreme-right online spaces in campaigns such as #EndFathersDay demonstrated the power of such manipulation and provided what media scholar Joan Donovan calls the ‘blueprint’ for other misogynistic misinformation campaigns (Donovan 2019). They were also quite successful in galvanising right-wing rage online.

One of the most significant misogynistic misinformation campaigns in the digital mediascape was #GamerGate. In August 2014, a relatively small group of mainstream male gamers and social media users began to use the #GamerGate hashtag; their purported purpose was legitimate — to register their objection to questionable journalistic ethics. That purpose, however, was a misogynistic ruse for challenging the visibility of women in the gaming world; Gamergaters were primarily concerned with a few increasingly prominent women in this world, whom they labelled ‘social justice warriors’: Anita Sarkeesian, Brianna Wu, and Zoe Quinn.

Gamergate began with a misinformation campaign: an aggrieved ex-boyfriend of Zoe Quinn posted a 6,000-word screed, claiming that Quinn, a game developer, slept with gaming journalists in return for good coverage. Though it was quickly demonstrated that this was a false claim, this misinformation, as Charlie Warzel in the New York Times puts it,

spiraled into an online culture war, ensnaring female gaming critics like Anita Sarkeesian and other designers like Brianna Wu who would suffer months of relentless abuse on and offline. What started as a revenge post over a break-up morphed into Gamergate: a leaderless harassment campaign to preserve white male internet culture, disguised as a referendum on journalism ethics and political correctness, which ran amok.

(Warzel 2019)

As several scholars have pointed out, Gamergate functioned as a kind of ‘rehearsal’ for what is now a normalised online culture of misogynistic harassment based on misinformation. As Warzel continues, Gamergate ‘was a rallying cry’. And it achieved its goal: ‘intimidating women, deceiving clueless brands and picking up mainstream coverage taught a once-dormant subculture powerful lessons about manipulating audiences and manufacturing outrage’ (Warzel 2019). The idea that Gamergate as a misinformation campaign was a ‘rehearsal’ for politics is telling, as it was, at its core, a misogynistic movement (Marwick and Lewis 2015; Massanari 2017). As Marwick and Lewis explain, ‘“Gamergater” has become shorthand for a particular kind of geek masculinity that feels victimized and disenfranchised by mainstream society, particularly popular feminism’ (Marwick and Lewis 2015). The ease with which bots, actual individuals, and websites could circulate misinformation about women and feminists within Gamergate reveals the entwined relationship between misogyny and misinformation. The legacy of Gamergate is that it provided a blueprint for how to wage misogynistic misinformation wars, as well as guidelines for more general misinformation campaigns mobilised by the extreme right.

Gamergate was successful as a misinformation campaign because it was allowed to proliferate unchecked and unregulated by media platforms, with the masculinist world of tech infrastructure on stand-by as supposedly objective observers. As many scholars have noted, the fact that social media platforms did nothing to curtail or prevent the continued abuse of women in Gamergate set the stage for what is now a broad digital environment that routinely uses misogynistic misinformation campaigns to control and discipline women. To return to the notion that the current media environment of misinformation is positioned as a ‘crisis’ of truth: when #EndFathersDay and #GamerGate were happening, and when black women and women in tech called attention to these misogynistic campaigns, media companies either ignored or dismissed them. Misogyny is not seen as a contemporary ‘crisis’, perhaps because it has existed as part of the structural environment for centuries; it is often invisible as ‘hate speech’ because it is so deeply structural.

Numerous other examples of misogynistic misinformation campaigns have occurred in the years since Gamergate. One of the most recent tactics is ‘deepfakes’, a technology for altering original video into a ‘fake’ copy and passing it off as authentic. Deepfakes, an AI-assisted technology, are important for thinking about the future of misinformation campaigns as they raise pressing questions about consent and how we consume visual information (Cole 2019). Historically, video has been what reporter Samantha Cole called ‘the gold standard of believability’, where what one sees on video is taken to be an authentic and true depiction of something that happened. But this tactic also has misogynistic practices as its origin story; as Cole reminds us, ‘When Redditors started using AI to attach celebrities’ faces to porn performers’ bodies, the media reaction focused on the implications for potential political hoaxes, but we need to focus on the women they harmed’ (Cole 2019).

The tactic of deepfakes is seen to have originated in a misogynistic campaign, in which a Reddit user named ‘deepfakes’ imposed actress Gal Gadot’s face onto the body of a woman in a pornographic film and then widely circulated the video. Indeed, the original use of deepfakes, and what remains one of their most common uses, involves swapping a cis-gender female celebrity’s face onto the body of a porn actress (Paris and Donovan 2019). These kinds of deepfakes remain ‘particularly troubling, primarily for its reification of women’s bodies as a thing to be visually consumed, here completely circumventing any potential for consent or agency on the part of the face (and bodies) of such altered images’ (Wagner and Blewer 2019). These deepfakes are clearly examples of misogynistic campaigns with misinformation and the violation of consent as their objectives; indeed, the non-consensual exploitation of women by deepfake creators is itself part of the logic of the technology, which works to objectify and use women — indeed, to own women’s bodies.
