The progression of the Russian model

The Russian government places great emphasis on directing information operations — an umbrella term for propaganda efforts that today include computational propaganda — and its use of online manipulation strategies exemplifies this broader international phenomenon. The Russian state sees information operations as ‘a decisive tool’ for maintaining social control rather than merely ‘a supporting element of state power’ (Allen and Moore 2018, 61). This explains its government-sponsored attempts to mold public opinion through a diverse network of agents and channels, tracking and targeting audiences at home (Yapparova 2019) as well as abroad (MacFarquhar 2020). In scholarly debates, Russia’s propaganda efforts have been interpreted through the lens of reflexive control theory (RC) (Thomas 2004; Bjola 2018; Till 2020; Kowalewski 2017). Developed by the Russian military during the Cold War, this elaborate form of propaganda is still regarded by scholars as ‘an empirical black box’ (Bjola 2018, 23).

Russian attempts at reflexive control played out prominently during the 2016 US election. The Internet Research Agency (IRA) leveraged an army of roughly 400 human curators who used bots and other digital tools to disseminate content (Chen 2015). Till (2020) described this group as ‘fully employed “agents of influence”’ used in an opaque digital form of Russian ‘statecraft’ (7). As part of a minimum daily quota, each ‘agent’ was expected to post comments on 50 articles, run six Facebook accounts with three posts on each, hold two discussions on Facebook, and manage ten Twitter accounts, populating those with at least 50 tweets a day (Singer and Brooking 2018).

Conceptually, the goal behind RC is to control the reflex of the opponent by finding ‘the weak links’ (for example, racial tensions) and further exploiting them to ‘sharpen ideological polarisation, maximize political disunity and weaken democratic institutions’ (Bjola 2018, 22). Walker and Ludwig (2017) call this form of deliberate manipulation ‘sharp power’, which acts like ‘the tip of [the] dagger’, aiming to ‘pierce, penetrate, or perforate’ the media system of targeted countries. Bjola (2018) states that manipulation is accomplished through ‘cognitive mapping’ and ‘micro-targeting’ of the intended audiences (22).

The rise of advanced computation and big data has opened up a wide range of possibilities for novel reflexive control tactics, used by Russia and a variety of other powerful political actors. Specifically, these innovations allow for new means of ‘exploiting moral, psychological ... as well as personal characteristics’ in the form of ‘biographical data, habits, and psychological deficiencies’ to acquire ‘the highest degree of reflex’, which maximises the chances of defeating the opponent by covertly influencing the other’s perception of reality (Thomas 2004, 241–242).

The sheer size of the global digital audience, with 3.8 billion users now present on social media (Kemp 2020), simplifies the task of orchestrating computational propaganda campaigns: the audience is vast, and exposure to a message is immediate. Moreover, the automated technology used on Twitter and other social media sites is increasingly cheap and easy to obtain. An investigation by The Daily Beast found that a social bot account could be purchased online for less than five cents (Cox 2017). By inundating important political conversations with bots to tilt public opinion in its favour, the Russian government shrewdly sidestepped an essential dilemma: either selectively censor problematic content or block Facebook and Twitter within Russia entirely (Sanovich 2019). Given the present power of social media in the global information infrastructure, neither measure would have been possible without causing a public outcry. In launching computational propaganda via reflexive control, Russia made a power move internally and externally, turning an old worry into a boon.

Early news coverage surrounding the 2020 US election suggests that the Russian propaganda machine geared up for a more sophisticated and abstruse ‘reflexive control’ operation aimed at influencing the general public. A CNN investigation run in collaboration with researchers at Clemson University, Facebook, and Twitter found a network of more than 200 accounts stirring up racial tensions within the Black Lives Matter community, drawing thousands of shares and reactions. Masquerading as US-based users, those accounts turned out to be run by trolls based in Ghana and Nigeria (Ward et al. 2020). Facebook later deemed these accounts to be engaged in ‘coordinated inauthentic behavior’ and linked their activities to the Russian IRA (Gleicher 2020).

Russian attempts to control US public opinion are not confined to social media. Another arm of Russian government propaganda, Radio Sputnik (formerly the Voice of Russia and RIA Novosti), recently opened for business on US soil. Radio Sputnik now plays at 104.7 FM in Kansas City, Missouri. Acting through a broker in Florida, the Russian government prepaid the Missouri-based Alpine Broadcasting Corporation $324,000 for three years’ worth of news programming at an hourly rate of $49.27 (MacFarquhar 2020).
