Directions for future research

Both the earlier discussion and the Christchurch shooting show that the interplay between extremism and information disorders seems to be cyclic. Against a backdrop of globalisation and immigration, societal and political polarisation is on the rise. This climate of polarisation feeds into a growing state of intergroup tension, conflict, and intolerance. These heightened levels of intergroup conflict are, in turn, a breeding ground for violent extremist attacks such as the one in Christchurch. Such attacks set in motion a chain reaction of similar attacks around the world, which further deepens societal polarisation, and the cycle starts over again. At the epicentre of this global cyclic movement lies the information ecosystem and, specifically, the increasing spread of information disorders.

In that sense, this cyclic pattern resembles the ‘flywheel hypothesis’ of extremism (see Frissen 2019, 89—93). This hypothesis states that such a cyclic chain of events behaves much like a mechanical flywheel, inasmuch as information disorders provide the initial energy that sets the cycle in motion and, at the same time, the additional kinetic energy that keeps it going. The stronger the driving force, the more kinetic energy is built up in the cyclic process and the more inertia the flywheel possesses. The metaphor also implies that even if the driving force is briefly taken away, the flywheel remains in motion for a while. It is thus through the driving forces of information disorders that the flywheel builds up kinetic energy and keeps turning.

A consequence of this hypothesis is that if we wish to study phenomena such as extremism — including the extreme right — we need to approach them from a ‘bird’s eye’ perspective. Current research lacks a holistic approach that enables a deeper understanding of the creation, dissemination, and impact of information disorders, as well as of the combined roles of interpersonal and mediated communication. Most research on the extreme right has taken either a theoretical approach (e.g. ‘What is it like?’) or a quantitative perspective aimed mainly at the sources (e.g. ‘Who follows it, and how is it spread?’). We know very little, however, about how the target audiences — that is, the users of this contentious and socially unacceptable content — actually define and make sense of it. Since it has often been argued that fake news and disinformation are ‘in the eye of the beholder’, there is a crucial need for additional research on people’s own understanding of contentious content. We need to better understand the social-psychological characteristics of vulnerable individuals (both as target audience and as subject of the contentious content) and to set up initiatives that make people more resistant to extreme-right disinformation. At the same time, increased scientific attention is needed to the role of digital platforms and the growing dominance of algorithms in the information ecosystem. For this kind of research, scholars may want to adopt predictive modelling, forecasting, and computational methods (such as agent-based models).
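As a purely illustrative sketch of what such computational methods might look like, the following hypothetical agent-based model implements a simple bounded-confidence opinion dynamic (in the spirit of the Deffuant model): agents influence one another only when their opinions are already close, and narrowing that confidence bound produces the opinion clustering associated with polarisation. All function names and parameters here are assumptions for illustration, not part of this chapter.

```python
import random

def interact(opinions, epsilon=0.2, mu=0.5):
    """One encounter: two random agents move toward each other,
    but only if their opinions already differ by less than the
    confidence bound epsilon (a stylised form of homophily)."""
    i, j = random.sample(range(len(opinions)), 2)
    if abs(opinions[i] - opinions[j]) < epsilon:
        midpoint = (opinions[i] + opinions[j]) / 2
        opinions[i] += mu * (midpoint - opinions[i])
        opinions[j] += mu * (midpoint - opinions[j])

def simulate(n_agents=100, n_steps=20000, epsilon=0.2, seed=42):
    """Run the model from uniformly random initial opinions in [0, 1]
    and return the final opinion distribution."""
    random.seed(seed)
    opinions = [random.random() for _ in range(n_agents)]
    for _ in range(n_steps):
        interact(opinions, epsilon=epsilon)
    return opinions
```

With a wide confidence bound the simulated population drifts toward consensus; with a narrow bound opinions freeze into separate, mutually unreachable clusters — a toy analogue of the polarisation dynamics discussed above, not a validated model of them.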


Note

1 The actual scientific and analytical meaning of the term ‘fake news’ evaporated almost overnight after its introduction by Craig Silverman (2016) and its appropriation by US president Donald Trump, who rightly saw it as a powerful weapon against critical journalists and media. A thorough discursive analysis of what fake news exactly is goes beyond the scope of this chapter; for such an analysis, see Farkas and Schou (2018).


References

Aristotle, Ross, D. and Brown, L. (2009). The Nicomachean ethics. New York: Oxford University Press. doi: 10.1017/UPO9781844653584.004.

Awan, I. (2016). Islamophobia on social media: A qualitative analysis of the Facebook’s walls of hate. International Journal of Cyber Criminology, 10(1).

Barberá, P. (2015). Birds of the same feather tweet together: Bayesian ideal point estimation using Twitter data. Political Analysis, 23, 76—91.

Bauman, Z. (1998). Globalization: The human consequences. Cambridge: Polity Press.

Bayer, J., Bitiukova, N., Bárd, P., Szakács, J., Alemanno, A. and Uszkiewicz, E. (2019). Disinformation and propaganda — impact on the functioning of the rule of law in the EU and its member states. Available at: www. (Accessed: 30 April 2020).

Ben-David, A. and Matamoros-Fernandez, A. (2016). Hate speech and covert discrimination on social media: Monitoring the Facebook pages of extreme-right political parties in Spain. International Journal of Communication, 10, 1167—1193.

Benkler, Y., Faris, R. and Roberts, H. (2018). Network propaganda: Manipulation, disinformation, and radicalization in American politics. New York: Oxford University Press.

Bennett, W. L. and Livingston, S. (2018). The disinformation order: Disruptive communication and the decline of democratic institutions. European Journal of Communication, 33(2), 122—139. doi: 10.1177/0267323118760317.

Bogerts, L. and Fielitz, M. (2019). “Do you want a meme war?”: Understanding the visual memes of the German far right. In Fielitz, M. and Thurston, N. (eds.), Post-digital cultures of the far right: Online actions and offline consequences in Europe and the US. Bielefeld: Transcript.

Börzsei, L. K. (2013). Makes a meme instead: A concise history of Internet memes. New Media Studies Magazine, 7, 1—25. Available at: https://works.bepress.com/linda_borzsei/2/.

Boudana, S., Frosh, P. and Cohen, A. A. (2017). Reviving icons to death: When historic photographs become digital memes. Media, Culture & Society, 39(8), 1210—1230.

Bruns, A. (2019). Are filter bubbles real? Hoboken, NJ: John Wiley & Sons.

Bruns, A. and Highfield, T. (2015). From news blogs to news on Twitter: Gatewatching and collaborative news curation. In Handbook of digital politics. Cheltenham: Edward Elgar Publishing.

Colleoni, E., Rozza, A. and Arvidsson, A. (2014). Echo chamber or public sphere? Predicting political orientation and measuring political homophily in Twitter using big data. Journal of Communication, 64, 317—332.

Dancygier, B. and Vandelanotte, L. (2017). Internet memes as multimodal constructions. Cognitive Linguistics, 28(3), 565—598.

Davey, J. and Ebner, J. (2017). The fringe insurgency: Connectivity, convergence and mainstreaming of the extreme right. Available at:

Davey, J. and Ebner, J. (2019). The great replacement: The violent consequences of mainstreamed extremism. Available at: wp-content/uploads/2019/07/The-Great-Replacement-The-Violent-Consequences-of-Mainstreamed-Extremism-by-ISD.pdf.

Diakopoulos, N. (2015). Algorithmic accountability: Journalistic investigation of computational power structures. Digital Journalism, 3(3), 398—415.

Ekman, M. (2014). The dark side of online activism: Swedish right-wing extremist video activism on YouTube. MedieKultur: Journal of Media and Communication Research, 30(56), 21.

Ernst, N., Engesser, S., Büchel, F., Blassnig, S. and Esser, F. (2017). Extreme parties and populism: An analysis of Facebook and Twitter across six countries. Information, Communication & Society, 20(9), 1347—1364.

European Commission. (2015). Eurobarometer 83: Public opinion in the European Union, July. Available at:

Farkas, J. and Schou, J. (2018). Fake news as a floating signifier: Hegemony, antagonism and the politics of falsehood. Javnost, 25(3), 298—314. doi: 10.1080/13183222.2018.1463047.

Farkas, J., Schou, J. and Neumayer, C. (2018). Cloaked Facebook pages: Exploring fake Islamist propaganda in social media. New Media & Society, 20(5), 1850—1867.

Flaxman, S., Goel, S. and Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly, 80(S1), 298—320.

Frissen, T. (2019). (Hard)wired for terror: Unraveling the mediatized routes and roots of radicalization. KU Leuven. Available at:

Gal, N., Shifman, L. and Kampf, Z. (2016). “It gets Better”: Internet memes and the construction of collective identity. New Media & Society, 18(8), 1698—1714.

Gillespie, T. (2018). Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. New Haven, CT: Yale University Press.

Grabe, M. E., Zhou, S. and Barnett, B. (1999). Sourcing and reporting in news magazine programs: 60 minutes versus hard copy. Journalism & Mass Communication Quarterly, 76(2), 293—311.

Heikkila, N. (2017). Online antagonism of the alt-right in the 2016 election. European Journal of American Studies, 12(2).

Hepp, A. (2020). Deep mediatization. New York: Routledge. doi: 10.4324/9781351064903.

Ichau, E., Frissen, T. and d’Haenens, L. (2019). From #selfie to #edgy: Hashtag networks and images associated with the hashtag #jews on Instagram. Telematics and Informatics, 44, 101275. doi: 10.1016/j.tele.2019.101275.

Jamieson, K. H. and Cappella, J. N. (2008). Echo chamber: Rush Limbaugh and the conservative media establishment. Oxford: Oxford University Press.

Knobel, M. and Lankshear, C. (2007). Online memes, affinities, and cultural production. A New Literacies Sampler, 29, 199—227.

Koehler, D. (2019). The Halle, Germany, synagogue attack and the evolution of the far-right terror threat. CTC Sentinel, 12(11), 14—20. Available at: CTC-SENTINEL-112019.pdf.

Krämer, B. (2014). Media populism: A conceptual clarification and some theses on its effects. Communication Theory, 24(1), 42—60.

Leman, J. (2016). Socio-political dimensions of extremism. In Countering Violent Extremism: Mujahada and Muslims’ responsibility. Brussels. Available at: national-symposium-countering-violent-extremism-mujahada-and-mushms2019-responsibility-15-2013-16-march-2016-brussels.

Lorenz, T. (2020). Michael Bloomberg’s campaign suddenly drops memes everywhere. Available at: www.nytimes.com/2020/02/13/style/michael-bloomberg-memes-jerry-media.html.

Macklin, G. (2019a). The Christchurch attacks: Livestream terror in the viral video age. CTC Sentinel, 18—30, July. Available at:

Macklin, G. (2019b). The El Paso terrorist attack: The chain reaction of global right-wing terror. CTC Sentinel, 12(11). Available at: NEL-112019.pdf.

Maly, I. (2019). New right metapolitics and the algorithmic activism of Schild & Vrienden. Social Media + Society, 5(2). doi: 10.1177/2056305119856700.

Marda, V. and Milan, S. (2018). Wisdom of the crowd: Multistakeholder perspectives on the fake news debate. Internet Policy Review Series, Annenberg School of Communication. SSRN. Available at: https://

Massanari, A. (2017). #Gamergate and the Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329.

Matamoros-Fernandez, A. (2017). Platformed racism: The mediation and circulation of an Australian race-based controversy on Twitter, Facebook and YouTube. Information, Communication & Society, 20(6), 930—946.

Metaxas, P. and Finn, S. (2019). Investigating the infamous #Pizzagate conspiracy theory. Technology Science. Available at: https://techscience.org/a/2019121802/.

Midlarsky, M. I. (2011). Origins of political extremism: Mass violence in the twentieth century and beyond. Cambridge: Cambridge University Press.

Morozov, E. (2013). To save everything, click here. New York: Public Affairs.

Mourão, R. and Robertson, C. (2019). Fake news as discursive integration: An analysis of sites that publish false, misleading, hyperpartisan and sensational information. Journalism Studies, 20(14), 2077—2095. doi: 10.1080/1461670X.2019.1566871.

Napoli, P. M. (2014). Automated media: An institutional theory perspective on algorithmic media production and consumption. Communication Theory, 24(3), 340—360.

Newman, N., Fletcher, R., Kalogeropoulos, A. and Nielsen, R. K. (2019). Reuters institute digital news report 2019, June 12. SSRN. Available at:

Newton, K. (2019). Surprising news: How the media affect — and do not affect — politics. Boulder: Lynne Rienner Publishers.

Nguyen, N. M. (2016). I tweet like a white person tbh! #whitewashed: Examining the language of internalized racism and the policing of ethnic identity on Twitter. Social Semiotics, 26(5), 505—523.

O’Callaghan, D., Greene, D., Conway, M., Carthy, J. and Cunningham, P. (2012). An analysis of interactions within and between extreme right communities in social media. In Ubiquitous social media analysis (pp. 88—107). Berlin and Heidelberg: Springer.

O’Callaghan, D., Greene, D., Conway, M., Carthy, J. and Cunningham, P. (2015). Down the (white) rabbit hole: The extreme right and online recommender systems. Social Science Computer Review, 33(4), 459—478.

Pariser, E. (2011). The filter bubble: How the new personalized web is changing what we read and how we think. New York: Penguin.

Pattyn, B. (2014). Media en mentaliteit. 3rd edn. Leuven: LannooCampus.

Pelletier-Gagnon, J. and Pérez Trujillo Diniz, A. (2018). Colonizing Pepe: Internet memes as cyberplaces. Space and Culture, doi: 10.1177/1206331218776188.

Poell, T. and Van Dijck, J. (2014). Social media and journalistic independence. Media Independence: Working with Freedom or Working for Free, 1, 181—201.

Schmid, A. P. (2013). Radicalisation, de-radicalisation, counter-radicalisation: A conceptual discussion and literature review. ICCT Research Paper. The Hague. Available at:

Sedgwick, M. (2010). The concept of radicalization as a source of confusion. Terrorism and Political Violence, 22(4), 479—494. doi: 10.1080/09546553.2010.491009.

Shifman, L. (2013). Memes in a digital world: Reconciling with a conceptual troublemaker. Journal of Computer-Mediated Communication, 18(3), 362—377.

Shifman, L. (2014). Memes in digital culture. Cambridge, MA: MIT Press.

Shoemaker, P. J. and Vos, T. (2009). Gatekeeping theory. London: Routledge.

Silverman, C. (2016). This analysis shows how viral fake election news stories outperformed real news on Facebook. Available at: (Accessed: 15 November 2018).

Sunstein, C. (2004). Democracy and filtering. Communications of the ACM, 47(12), 57—59.

Sunstein, C. R. (2007). Republic.com 2.0. Princeton, NJ: Princeton University Press.

Tandoc, E. C. and Vos, T. P. (2016). The journalist is marketing the news: Social media in the gatekeeping process. Journalism Practice, 10(8), 950—966.

Topinka, R. J. (2018). Politically incorrect participatory media: Racist nationalism on r/ImGoingToHellForThis. New Media & Society, 20(5), 2050—2069.

Venturini, T. (2019). From fake to junk news: The data politics of online virality. In D. Bigo, E. Isin and E. Ruppert (Eds.), Data politics: Worlds, subjects, rights. London: Routledge.

Wall, T. and Mitew, T. E. (2018). Swarm networks and the design process of a distributed meme warfare campaign. First Monday, 22, 1—33.

Wallace, J. (2018). Modelling contemporary gatekeeping: The rise of individuals, algorithms and platforms in digital news dissemination. Digital Journalism, 6(3), 274—293.

Walter, S., Brüggemann, M. and Engesser, S. (2018). Echo chambers of denial: Explaining user comments on climate change. Environmental Communication, 12(2), 204—217.

Yadlin-Segal, A. and Ramasubramanian, S. (2017). Media influence on stigma. In P. Rössler (Ed.), The international encyclopedia of media effects. Malden: Wiley-Blackwell.

Yoon, I. (2016). Why is it not just a joke? Analysis of Internet memes associated with racism and hidden ideology of colorblindness. Journal of Cultural Research in Art Education, 33.

Zannettou, S., Caulfield, T., Blackburn, J., De Cristofaro, E., Sirivianos, M., Stringhini, G. and Suarez-Tangil, G. (2018). On the origins of memes by means of fringe web communities. In Proceedings of the Internet Measurement Conference 2018 (pp. 188—202), October. doi: 10.1145/3278532.3278550.

Zuiderveen Borgesius, F., Trilling, D., Möller, J., Bodó, B., De Vreese, C. H. and Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review: Journal on Internet Regulation, 5(1).
