Zoom Is Memetic Warfare: Zoombombing and the Far Right


The goal of media scholarship is to help us understand the histories and afterlives of communicative forms. Digital or “new” media presents challenges because of its ephemerality, massive scale, and emergent qualities, requiring us to analyze it while living in it. Memes, user-produced mashup images that combine text with familiar figures such as cats, cartoons, and badly drawn MS Paint images, have received particular attention as indispensable and characteristic tools of the right. “The left can’t meme” is itself a meme: a repeatable, memorable short piece of content that has roots in earlier forms and lives within an ecosystem of actors, material artifacts, production cultures, and cultural politics.

In this chapter, we situate zoombombing within the memic architecture of US racial and gender digital politics, examining how the practice serves the right’s purposes even when it is not deployed by the right. Zoombombing deploys classic online harassment techniques and adds something new: it exploits Zoom’s uniquely liminal space, a space of intimacy generated by users via the relationship between the digital screen and what it can depict, the device’s audio tools and how they can transmit and receive sound, the software that we can see, and the software that we can’t.

Two conditions have paved the way for zoombombing: a resurgent fascist movement that has found its legs and best megaphone on the Internet, and an often-unwitting public suddenly required to spend many hours a day on this platform. COVID permitted no training or onboarding and offered no advance warning about the dangers of this platform. As this book shows, Zoom benefited from a boom that made it especially attractive to racist actors and racist acts. The Internet has always offered the option of creating community or allowing work to flow into the intimate spaces of the home, but with the global pandemic, these spaces have become intricately and necessarily linked, and Zoom has stepped up to become a useful and, at the same time, monopolizing force for the facilitation of this overlap. Zoom and other videoconferencing services are both the bad object and the unquestionable infrastructure for social and economic life. Zoombombing repurposes new platforms in dynamic and far-reaching ways; it is both like and unlike older styles of racial and gendered attacks but is, in the end, a uniquely COVID-era and neofascist activity.1 Zoombombing sprang into being as a widespread practice because it tapped into the frustration and opportunities presented by the US’s culturally polarized society, buoyed by a platform that provides ample opportunities for these particular styles of attacks.

Internet harassment shifts and adapts along with new platforms, growing in parallel with them as they get bigger and more popular. Zoombombing is more than just trolling; though it belongs to a broad category of online behavior meant to produce a negative reaction, it has an intimate connection with online conspiracy theorists and white supremacists. Trolls want an audience, and big platforms produce big audiences. While trolling is often racist and misogynistic, it can also entail less nefarious tactics and goals, such as rickrolling. Zoombombing should not be lumped into the larger category of trolling, both because the word “trolling” has become so broad it is nearly meaningless at times, and because zoombombing is designed to cause intimate harm and terrorize its targets in distinct ways.

While some forms of zoombombing mimic trickery and mischief that were already present in spaces such as real-life classrooms and town halls, this book analyzes how zoombombing departs from the Internet’s prior histories of joyful and mischievous pranking and functions instead as hateful harassment. The tactic emerged from online spaces that deploy memetic warfare and from the online ecosystems that radicalize users into increasingly violent actions.

Some of the earliest examples of zoombombing happened in the classroom. Like other online harassment campaigns, zoombombing is nothing if not opportunistic. As described in Chapter 1, these groups leave traces of their active organizing and mobilization on other online platforms such as Twitter and Discord. Their planning focuses on gathering more participants, offering up details about the space that is going to be attacked, and providing the attackers with a set of images, rhetoric, and video tools. At times the instigator will make requests based on the intended target, usually linked to their race and/or gender, but always focused on shock value provided through profanities, racial slurs, and graphic porn. In a Discord chat used to organize a zoombombing raid, one student who volunteered his class’s login code for Zoom later drew the line at showing porn to fourth-grade students.2 There were many other examples of trolling behavior linked to classrooms that intended not only to disrupt but to humiliate or harm the teacher or students by targeting their race and gender. Though there is a wide spectrum of tactics and goals within the larger category of zoombombing, we’ve found that it is most commonly used for racist and gendered attacks and that attackers regularly seek out spaces intended for community and safety.

COVID has reprioritized and reframed everyday acts of living, working, and communicating and has created fertile ground for a regime of seemingly random racial terror. Zoombombing now fits into a growing framework of memetic warfare: platforms such as Discord, Twitter, and 4chan funnel bombers and unwitting users alike into an unpredictable, unmoderated, anonymous, and consequence-free space.

We entitled this book Racist Zoombombing to distinguish our topic from other forms of Zoom disruption that don’t have a racial component. Some instances of zoombombing were intended to disrupt institutional spaces, such as classrooms or government board meetings, and may or may not have used racial and gendered content. We also understand the messiness of that distinction. At times context and content seem antithetical. The content used to disrupt could be racist even if the targets themselves were white, or sexist even when the target was male. This speaks to zoombombing’s memic nature, which draws from the past and present to repurpose a formula and to spread its content as far as possible. Racism and misogyny are endemic to the US cultural context and are thus repurposed for a variety of contexts. We are not claiming that zoombombing is in itself an inherently racist act, but that through its targets or its content, it is used as a form of racial and gendered terror, and that by and large, the targets of these attacks are people of color and women.

Kids disrupt class online just as they did offline, and bored and frustrated people with new time on their hands and justified aggression about being stuck at home seek connection by associating with subcultural groups online that organize around disruption. Because the majority of zoombombings reported in the press involve overtly racist name-calling, pornographic imagery, threats to kill people of color, images of swastikas and KKK regalia and symbolism, and other signs and signifiers, we see this subset of Zoom misuse as both distinctive and as belonging to a more expansive category of proto-fascist content that has been banned on other platforms but appears there frequently nonetheless.

We trace zoombombing’s genealogy back to the memetic warfare waged by the far right and its use of the same racist signs and images disguised and embedded within more playful ones, such as the cartoon character Pepe the Frog. Like certain memes or other tactical appropriations of popular culture, zoombombing is designed to provide users deniability. When zoombombing happens in a group of all white people and bypasses people of color, it overlaps structurally, historically, and semiotically with the far right even when its perpetrators are not aware of or participating in those movements, which is exactly why we call this memetic warfare. Zoombombing serves the far-right racist movement even when the carriers of this meme who decide to zoombomb for fun aren’t aware of it. A virus doesn’t need a host to believe in it: it just needs a carrier and a population to infect.

The excitement of trolling has much to do with the excitement of emergent behavior on the Internet that utopians celebrate: the thrill of not knowing how far this might go, a tonic for alienated and often legitimately disenfranchised people who cling to this right as other, more meaningful forms of social engagement have disappeared or were never present in the first place. The history of trolling is inseparable from racist and misogynistic content and behavior because these are often the tools used to shock and harm, and in this way zoombombing is like other kinds of trolling that leverage novelty and strong effects. While zoombombing can certainly be understood as part of the lineage or ecosystem of trollish behavior, we argue that it needs to be critiqued and understood as more than simply trolling, because that term emerged during an earlier, less media-rich and less interpersonally live Internet.

Many people have heard the term “red-pilling”3 before, as documentaries such as The Red Pill (2017) and The Brainwashing of My Dad (2015) find their way onto Netflix and journalists use it in articles about the rise of far-right populist movements. The term is drawn from The Matrix film franchise and describes how the memetic Internet recruits users into far-right movements and how digital media content is both an act of community building and a form of propaganda that provides momentum and power to Internet harassment.4 Choosing to take the red pill rather than the blue pill awakens one to the “reality” of feminism as a plot against men, liberalism as a way to victimize white people, and diversity initiatives as indoctrination and as “Black supremacy.” Those who take the red pill claim a new awareness of the lies and harm that feminism or multiculturalism have created for society at large, and of the particular harm that white males face as a consequence. To be red-pilled is to be radicalized into a male-supremacist and/or white-supremacist community, to be “awakened.” The term took root in the manosphere/incel communities and quickly spread to white supremacist spaces. It might mean following “Q” and the QAnon movement, a conspiracy theory that envisions Democrats as child molesters and has migrated from an online-only space to offline political demonstrations; QAnon signs have been spotted at Trump rallies. To “take the red pill” is to have one’s eyes opened to the supposed harms of feminism, liberalism, or multiculturalism5 and to be inducted, ideologically, into a loosely defined subculture that opposes progressivism or mainstream society.

Red-pilling can be more or less focused on aspects of radical-right thinking; it can be more or less organized toward male supremacy; or it can be loosely racist or homophobic, depending on which platform or community the red-pilling takes place in. Red-pilling is attractive to resentful and angry converts because it rewards and requires action. This can be as simple as sharing your conversion story on a Reddit forum, making a racist meme “for the lulz,” or zoombombing a meeting using one of these memes. Red-pilling is fundamentally about creating noise, violence, and harm.6 These communities and subcommunities’ tactics and tools are so malleable and widespread that they have infiltrated the broader culture.7

Those who participate in this particular type of antagonism, trolling, or general harassment are taking part in a culture of red-pilling, whether or not they realize it or actively engage with spaces more readily understood as extremist.8 The Overton Window, or the indicator of what is publicly acceptable in social discourse, has shifted so far to the right that it is now not only possible but absolutely the norm for zoombombing to be viewed as a relatively harmless prank compared to the plethora of less ephemeral, longer-duration, overtly racist and misogynistic content and behavior online.

The term and ideologies behind red-pilling have become so widespread that intentional recruitment is not actually required for someone to “fall down the rabbit hole” into engaging with the same ideologies and tactics that the more radical online spaces traffic in. As some researchers have pointed out, there is such a plethora of misinformation and online hate speech that users can effectively “red-pill” themselves.9 One of the core tactics used by recruiters is to frame racist or misogynistic material as “trolling” or, as Ryan Milner calls it, the “logic of the lulz,” which attaches a veneer of plausible deniability to acts of harassment or overt ideological terrorism.10 This was a key element in early online harassment, and zoombombing has refined it for the COVID age.

Trolling is both an act of aggression meant to situate the victim as unwanted and unwelcome and a way to codify shared values for the subcultural group that deploys these tactics. It defines the target as the “other,” and in doing so, it helps strengthen the internal characteristics of the community. Zoombombing conceals and contains the terror and psychological harm that targets of active harassment face because it doesn’t leave a trace unless an alert user records the meeting. Likewise, zoombombing is articulated, most commonly, through a focus on anti-Blackness. Even in instances where a Black subject is not present, anti-Black imagery and language are often deployed. This core element of zoombombing speaks to the longer history of the United States and how structures of power, subjugation, and race are articulated through white supremacy and anti-Blackness. Similarly, it also showcases the racialized history of Internet culture and the power dynamics of historic harassment.

Zoombombing is the latest iteration of a much longer history of loosely organized, highly effective, memetic campaigns by subcultural groups that often have violent real-world aftereffects. Zoombombing, like other kinds of bombing, induces terror through targeted violence against an enemy other, often in gendered and racialized terms but operating under the cover of the impersonal. And yet, racism and sexism are always personal; they are an attack on a person’s very being and identity. Even the most innocuous cultural objects can be easily weaponized, such as Pepe the Frog, whose origins were apolitical but who is now recognized as a hate symbol.11 Pepe has for years been used in memetic warfare campaigns to harass and threaten targets through racist and misogynistic humor. As a cultural symbol, he signals to a loose membership of networked communities or ideologies that users can tap into when useful. And yet, he is often used to claim an updated version of “the logic of the lulz,” which separates intent from action.

In much of the commentary we found on online platforms used to organize these attacks, participants framed their use of zoombombing similarly. They saw their actions as humorous and claimed in the comments not to understand why people reacted as strongly as they did, why they didn’t get the joke, even as it was clear that the violence and shock of the act were always the point. The culpability for these actions, and the resulting harm, falls on the perpetrators but also on the infrastructures that continue to support and allow zoombombing to take place. Earlier this year, despite the mounting evidence of these attacks, Zoom continued to maintain that these acts were simply “party-crashing” and not part of an organized campaign of hate. While Zoom’s language has changed after organizing by those affected, there continues to be a lack of responsibility and support.

The spreadability of memes such as Pepe or zoombombing allows for a variety of uses. Not every Pepe image is deployed in a racist way, and not every user of Pepe is a white supremacist.12 Context does matter. Nonetheless, zoombombing and Pepe are both part of a larger contextual framework that has effectively mobilized and empowered hate campaigns. Zoombombing is itself a meme.13 As with Pepe, zoombombers derive social capital from their usage through constructs of violence, and memes and memetic warfare are an important part of the accumulation of social capital. Limor Shifman calls Internet memes “units of popular culture that are circulated, imitated, and transformed by Internet users, creating a shared cultural experience.”14 This shared experience, however, takes on a particular racialized and gendered meaning when viewed through the longer history of the fight to keep the Internet a white, male space. Zoombombing, then, is simply a newer version of this longer struggle. It deploys the same tactics through a new venue. Racist and misogynistic language and imagery have long been used as a guerrilla-warfare tactic to push unwanted individuals out of a digital space through fear, disgust, or discomfort. Though new platforms look different, these goals and tactics remain the same.

Memes are the improvised explosive device (IED) of information warfare.15 We are well in the midst of a digital cultural war based on information and data, rather than weapons and bodies, and memes and other elements of the far right’s political aesthetic play a key role in this conflict.16 The rise of the online troll as a political player and the alt-right are merely the logical outcomes of these systems.17 The term “zoombombing” is inherently violent, invoking terrorist and warlike tactics. It is no coincidence that these martial metaphors are so common in online spaces, as culture wars are largely waged online. Like explosives, memes are not precise weapons: they are not easy to control once circulated on social platforms, and they can harm more than the intended target.

Memetic warfare isn’t always racist; some of the most effective and widespread anti-racist campaigns during COVID have also been memetic. Memes can be a genuine form of resistance against propagandized rhetoric by powerful institutions and can be used by victims to disrupt systems of power and harm. Fans of BTS, a popular K-Pop group, who are also known as the “Army,” have appropriated military language to a very different end, coordinating highly successful attacks against white supremacy hashtags by creating their own memes and posting thousands of fancams of group members.18 This kind of memetic warfare can push back against hate groups or carceral regimes with stark examples of opposing experiences. It can draw together the masses to disrupt power.

For example, though the #myNYPD campaign was designed to collect promotional material meant to showcase positive interactions with police officers, users flooded it with thousands of images of police brutality, and the effort eventually spread nationally to include similar hashtag campaigns such as #myLAPD.19 A more current example of memetic warfare comes from the 2020 campaign trail, when K-Pop stans, fans of particular Korean pop groups, rallied together to take over the hashtag #whitelivesmatter, an inherently white supremacist pushback to the growing Black Lives Matter movement.20 Or when thousands of teens on TikTok waged multiple attacks against the Trump reelection campaign through negative reviews on Trump’s reelection app;21 TikTokers and K-Pop stans claimed credit for tanking the numbers at Trump’s Tulsa, Oklahoma, rally in June.22

However, while these examples are lauded as highly impactful, most campaigns “from the other side” are rarely as effective as those “on the right.” One key to the right’s effectiveness is the straightforwardness of their goals and their allergy to nuance. Those who take part in campaigns of targeted harassment want to create havoc, confusion, and harm. They do so by using the most direct and explosive tools available to them. Zoombombers enter a space meant for joy, intimacy, celebration, or collaboration, and disrupt and shock its target audience through violent imagery or language. Making the targets feel unsafe and unwanted in their space and their skin is the simplest yet most effective strategy. Early trolling culture claimed to be apolitical, targeting a spectrum of adversaries with the common goal of showcasing a nihilistic, trickster aesthetic.23 And yet, even then, misogyny and racism, in particular anti-Black racism, were core tactics in those attacks and harassment campaigns.

Zoombombing is simply one of the latest memetic weapons used against people in precarious social positions. The US military considers memetics a subset of neurocognitive warfare and a tool in “information war.” Although memetic warfare is often understood through a focus on state-against-state tactics, it also refers to spaces where online self-designated “meme warriors” have launched targeted attacks against a cultural enemy group in a variety of organized and disorganized ways. Meme warriors see themselves as digital guerrilla fighters against institutional monopolies on knowledge and narratives, such as the mainstream media and other centralized authorities.24 Gamergate, “the Fappening” (or Celebgate), and the subsequent Comicsgate are examples of campaigns that deployed active memetic warfare in racialized and gendered ways to attack and drive away a constructed enemy.25

Like memes, zoombombing operates under the moral cover of humor. Yet as we have found from speaking to Dr. Tiara Moore, Angelique Herring, and Dr. Dennis Johnson, hearing the “N” word shouted at you during your dissertation defense is a form of informational and psychological warfare, and zoombombing, like other memetic warfare campaigns, has grown naturally, and asymmetrically, across multiple platforms. All users are vulnerable by design on Zoom, but the dangers are not evenly distributed. It is truly inspiring to see how users are creating sacred and nurturing spaces on Zoom by offering free yoga and boxing classes; holding prayer groups and meditation sittings; conducting funerals, weddings, and graduations on video; and saying their last goodbyes to loved ones with COVID as they pass from this life alone in hospital beds. It is exactly because Zoom is a lifeline to community and intimacy that Black life is particularly targeted there. As previous memes such as Barbecue Becky and the driving, walking, or standing while Black catchphrases have demonstrated, the sight of Black joy or public life enrages and disturbs whiteness and those white folks who feel their privilege is threatened.

At its best, the Internet networks and connects individuals and allows them to create new forms of knowledge, community, and intimacy. When used as a space for joy, or to organize to disrupt power and oppression, memes and other emergent digital practices can be creative, collaborative, and a force for good. However, Zoom is often used to support and perpetuate harm and, for better or for worse, has become both a battleground and the COVID era’s site of connection for work, family, and community. And like other styles of warfare, those who are most targeted and most harmed are those who already live in a state of precarity. Racism can be separated neither from our understanding of technology nor from cultural movements. Zoombombing is simply the most recent iteration of the culture wars played out online.

Notes
  • 1 Steinbeck, “Virtual UGA Guest Lecture Hijacked with Death Threats, Racial Slurs Directed toward Professors.”
  • 2 Kan, “Students Conspire in Chats to ‘Zoom-Bomb’ Online Classes, Harass Teachers.”
  • 3 “What the Red Pill Means for Radicals”; Lewis, “The Online Radicalization We’re Not Talking About.”
  • 4 “What the Red Pill Means for Radicals.”
  • 5 “What the Red Pill Means for Radicals.”
  • 6 Cunha, “Red Pills and Dog Whistles.”
  • 7 Phillips, “The Oxygen of Amplification.”
  • 8 Crawford, “The Influence of Memes on Far-Right Radicalisation.”
  • 9 “VasilistheGreek (Discord ID: 270328712367570955).”
  • 10 Milner, “FCJ-156 Hacking the Social.”
  • 11 Morlin, “Pepe Joins (((Echoes))) as New Hate Symbols.”
  • 12 Chan, “Intimacy, Friendship, and Forms of Online Communication among Hidden Youth in Hong Kong.”
  • 13 What is a meme? According to Limor Shifman, memes are “(a) a group of digital items sharing common characteristics of content, form, and/or stance, which (b) were created with awareness of each other, and (c) were circulated, imitated, and/or transformed via the Internet by many users,” Shifman, Memes in Digital Culture, 367.
  • 14 Shifman, Memes in Digital Culture, 367.
  • 15 Siegel, “Is America Prepared for Meme Warfare?”
  • 16 Bogerts and Fielitz, “Do You Want Meme War?”
  • 17 Fichman and Sanfilippo, Online Trolling and Its Perpetrators; Graham, “Boundary Maintenance and the Origins of Trolling”; Greene, “‘Deplorable’ Satire”; Hodge and Hallgrimsdottir, “Networks of Hate.”
  • 18 “K-Pop Fans Drown out #WhiteLivesMatter Hashtag.”
  • 19 Lopez, “Twitter Critics Take on LAPD after NY Police Hit on Social Media.”
  • 20 Ohlheiser, “How K-Pop Fans Became Celebrated Online Vigilantes.”
  • 21 Banjo and Egkolfopoulou, “TikTok Teens Are ‘Going to War’ Against the Trump Campaign After Republicans Call to Ban the App.”
  • 22 Lorenz, Browning, and Frenkel, “TikTok Teens and K-Pop Stans Say They Sank Trump Rally.”
  • 23 Phillips, Beyer, and Coleman, “Trolling Scholars Debunk the Idea That the Alt-Right’s Shitposters Have Magic Powers.”
  • 24 #Gamergate is a good example of a memetic campaign against women in gaming that figured itself as an insurgent protest of a larger and more powerful entity—the gaming journalism establishment.
  • 25 Massanari, “#Gamergate and The Fappening.”


Banjo, Shelly, and Misyrlena Egkolfopoulou. “TikTok Teens Are ‘Going to War’ Against the Trump Campaign After Republicans Call to Ban the App.” Time, July 10, 2020. https://time.com/5865261/tiktok-trump-campaign-app/.

Bogerts, Lisa, and Maik Fielitz. “‘Do You Want Meme War?’: Understanding the Visual Memes of the German Far Right.” (2019): 137-153. https://doi.org/10.14361/9783839446706-010.

Centre for Analysis of the Radical Right. “What the Red Pill Means for Radicals.” June 8, 2018. https://www.radicalrightanalysis.


Chan, Gloria Hongyee. “Intimacy, Friendship, and Forms of Online Communication among Hidden Youth in Hong Kong.” Computers in Human Behavior 111 (October 2020): 106407. https://doi.org/10.1016/j.chb.2020.106407.

Cunha, Darlena. “Red Pills and Dog Whistles: It Is More than ‘Just the Internet.’” Al Jazeera, September 6, 2020. https://www.aljazeera.com/opinions/2020/9/6/red-pills-and-dog-whistles-it-is-more-than-just-the-internet/.

Crawford, Blyth. “The Influence of Memes on Far-Right Radicalisation.” Centre for Analysis of the Radical Right (blog), June 9, 2020. https://www.radicalrightanalysis.com/2020/06/09/the-influence-of-memes-on-far-right-radicalisation/.

Fichman, Pnina, and Madelyn R. Sanfilippo. Online Trolling and Its Perpetrators: Under the Cyberbridge. Lanham, MD: Rowman & Littlefield, 2016.

Graham, Elyse. “Boundary Maintenance and the Origins of Trolling.” New Media & Society, May 30, 2019. https://doi.org/10.1177/1461444819837561.

Greene, Viveca S. “‘Deplorable’ Satire: Alt-Right Memes, White Genocide Tweets, and Redpilling Normies.” Studies in American Humor 5, no. 1 (2019): 31-69. https://doi.org/10.5325/studamerhumor.5.1.0031.

Hodge, Edwin, and Helga Hallgrimsdottir. “Networks of Hate: The Alt-Right, ‘Troll Culture’, and the Cultural Geography of Social Movement Spaces Online.” Journal of Borderlands Studies 35 (February 26, 2019): 1-18. https://doi.org/10.1080/08865655.2019.1571935.

“K-Pop Fans Drown out #WhiteLivesMatter Hashtag.” BBC News, June 4, 2020, sec. Technology. https://www.bbc.com/news/technology-52922035.

Kan, Michael. “Students Conspire in Chats to ‘Zoom-Bomb’ Online Classes, Harass Teachers,” n.d. https://www.pcmag.com/news/students-conspire-in-chats-to-zoom-bomb-online-classes-harass-teachers.

Lewis, Becca, and Alice Marwick. “The Online Radicalization We’re Not Talking About.” Intelligencer. Accessed November 1, 2020. https://nymag.com/intelligencer/2017/05/the-online-radicalization-were-not-talking-about.html.

Lopez, Robert J. “Twitter Critics Take on LAPD after NY Police Hit on Social Media.” Los Angeles Times, April 23, 2014, sec. California. https://www.latimes.com/local/lanow/la-me-ln-twitter-critics-mylapd-mynypd-20140423-story.html.

Lorenz, Taylor, Kellen Browning, and Sheera Frenkel. “TikTok Teens and K-Pop Stans Say They Sank Trump Rally.” The New York Times, June 21, 2020, sec. Style. https://www.nytimes.com/2020/06/21/style/tiktok-trump-rally-tulsa.html.

Massanari, Adrienne. “#Gamergate and The Fappening: How Reddit’s Algorithm, Governance, and Culture Support Toxic Technocultures.” New Media & Society 19, no. 3 (March 2017): 329-346. https://doi.org/10.1177/1461444815608807.

Milner, Ryan M. “FCJ-156 Hacking the Social: Internet Memes, Identity Antagonism, and the Logic of Lulz.” The Fibreculture Journal, no. 22 (2013). http://twentytwo.fibreculturejournal.org/fcj-156-hacking-the-social-internet-memes-identity-antagonism-and-the-logic-of-lulz/.

Morlin, Bill. “Pepe Joins (((Echoes))) as New Hate Symbols.” Southern Poverty Law Center, September 28, 2016. https://www.splcenter.org/hatewatch/2016/09/28/pepe-joins-echoes-new-hate-symbols.

Ohlheiser, Abby. “How K-Pop Fans Became Celebrated Online Vigilantes.” MIT Technology Review, June 5, 2020. https://www.technologyreview.com/2020/06/05/1002781/kpop-fans-and-black-lives-matter/.

Phillips, Whitney. “The Oxygen of Amplification.” Data & Society, May 22, 2018. https://datasociety.net/library/oxygen-of-amplification/.

Phillips, Whitney, Jessica Beyer, and Gabriella Coleman. “Trolling Scholars Debunk the Idea That the Alt-Right’s Shitposters Have Magic Powers.” Vice, March 22, 2017. https://www.vice.com/en/article/z4k549/trolling-scholars-debunk-the-idea-that-the-alt-rights-trolls-have-magic-powers.

Shifman, Limor. Memes in Digital Culture. Cambridge, MA: The MIT Press, 2013.

Siegel, Jacob. “Is America Prepared for Meme Warfare?” Vice, January 31, 2017. https://www.vice.com/en/article/xyvwdk/meme-warfare.

Steinbeck, Foster. “Virtual UGA Guest Lecture Hijacked with Death Threats, Racial Slurs Directed toward Professors.” The Red and Black. Accessed November 2, 2020.

Unicorn Riot: Discord Leaks. “VasilistheGreek (Discord ID: 270328712367570955).” Accessed November 1, 2020. https://discordleaks.unicornriot.ninja/discord/user/1445.
