‘Psy-oping the home team’

Over this period, militaries struggled to ensure that doctrine kept abreast of technological advances, the new capabilities they made available, and their effects on the contemporary battlefield. When the USAF’s Basic Doctrine (2003 [1997]) was revised in 2003, it reflected the air force’s developing grasp of the battlespaces specific to information operations and the assets that had to be controlled to ensure information dominance. The revised doctrine proposed that the ‘action taken to affect adversary information and information systems while defending one’s own information and information systems’ was not a single entity but the product of three integrated non-kinetic actions: ‘Electronic Warfare Operations’, ‘Network Operations’, and ‘Influence Operations’ (USAF 2003, 31). While the first two focus on control over radio frequencies, ‘satellite links . . . telemetry . . . telecommunications; and wireless communications network systems’, influence operations take place in the ‘cognitive battlespace’.18 Here, ‘Influence operations . . . capabilities’ are deployed to

affect behaviors, protect operations, communicate commander’s intent, and project accurate information to achieve desired effects across the cognitive battlespace. These effects should result in differing behaviors or a change in the adversary decision process, which aligns with the commander’s objectives.

(USAF 2003, 31)

Efforts to target enemy decision-making and influence foreign and domestic audiences were evident in an array of information operations undertaken in Afghanistan and Iraq. The Taliban’s only radio transmitter in Kabul was destroyed by cruise missiles in October 2001

when US forces invaded, ‘and the frequency was taken over by U.S. PSYOP’. Its broadcasts sought to ‘explain to Afghanis what happened in the United States on September 11, 2001, and why our government had decided to invade their country’. To maximise the audience and the information effect, the air force dropped ‘thousands of hand-cranked radios locked to U.S. PSYOP frequencies’ (King 2011, 8). In an effort to bolster its ‘source credibility’ in Iraq, the US planted ghost-written, pro-US messages in local newspapers, falsely attributed to Iraqi authors (Marx 2006, 51–59). On the home front, the Department of Defense Digital Video Imagery Distribution System (DVIDS) supplied ‘pre-packaged video, audio news stories, and images of U.S. military activity’, gathered by military public affairs personnel, ‘without charge, to any broadcast news organization in the world, including U.S. domestic channels’. Its ‘up-to-the-minute images and broadcast-quality video of military activity’ were supplemented by ‘a huge, accessible electronic library of previously produced images, video, and stories’. Sara King notes that while ‘all major U.S. networks, both over-the-air and cable, use DVIDS material’, their viewers would have had no idea which material had been gathered by professional journalists and which by uniformed servants of the military, as ‘Information provided by DVIDS is identified as “public” and users are not required to credit DVIDS when using the products that it provides’ (King 2011, 11). Thus, military propaganda and the disinformation it entails is transformed into news.

British efforts to ensure an equivalent pipeline of military-sourced information coincided with and grew out of the drawdown in Afghanistan in 2014, which instigated ‘the biggest shake-up to military reporting in a generation’. As the traffic of British reporters through Camp Bastion dwindled to near zero, news management teams in the MoD were thinned out, and their remaining staff were redeployed to the media operations group (MOG) to work on ‘direct to audience communication’. Under these new arrangements, the gaps in information provision would be filled by uniformed ‘media operators . . . filming and photographing [the army’s] own operations, before posting the edited material online’ (Hill 2014).

These reforms were led by Stephen Jolly, a former instructor with 15 (UK) Psychological Operations Group (15 PsyOps). Jolly was ‘keen to see a greater emphasis on this kind of in-house news-gathering, in which material is channelled through the open gateway of digital communication and social media’, free from oversight by the fourth estate. In 2014, when MOG and 15 PsyOps moved into neighbouring buildings at Denison Barracks in Berkshire to form the new security assistance group (SAG), their cohabitation laid bare the MoD’s determination to lower the firewall between news provision and information operations:

Traditionally, the two worlds of the MOG and Psyops have existed in separate universes, the former being expected to deal in the honest-to-goodness truth, the latter being more closely associated — fairly or unfairly — with the “dark arts”, usually directing its material at an enemy’s audience.

(Hill 2014)

Two years earlier, 15 PsyOps had been awarded the Firmin Sword of Peace for setting up and supporting seven local radio stations across Helmand.19 Research had revealed that, given the low rates of literacy among Afghans and negligible internet coverage, the most effective channel for psy-ops was radio.20 The unit’s CO, Commander Steve Tatham, insisted that the stations were committed to information provision, not covert influence:

Psy-ops is all about communicating with people around and on the battlefield, who ordinarily might not hear what’s going on. . . . Most of our work in Helmand is about talking to Afghans, and explaining and encouraging them to engage in the debate about what’s happening in their country.

(Wyatt 2012)21

Captain Kieron Lyons, who ran one of the stations and had previously ‘spent a lot of time planning the “information effect” for large-scale military operations’ in Afghanistan, acknowledged that while the broadcast material had to be truthful and attributable, its purpose was ‘to create behavioural change’ (Wyatt 2012).

To ensure that sufficient numbers of Afghans could tune in to the broadcasts from 15 PsyOps-sponsored Radio Tamadoun (‘all the Afghans I ever met called it what it was, Radio ISAF’), psy-ops personnel handed out thousands of wind-up radios to locals. Chris Green, who monitored coalition influence effects in Helmand, claimed he ‘never saw any of the locals actually using the radios or listening to Radio ISAF’. Despite claims that some DJs were attracting audiences of up to 50,000 for their shows, signal reception beyond the towns was ‘patchy at best and non-existent in many areas’. Further, phone-in reports revealed that most of the calls to the station’s much-vaunted talkback sessions ‘came from the same half-dozen callers’. In Green’s view, ‘by overstating the role and value’ of radio in the counter-insurgency, 15 PsyOps had been ‘psy-oping the home team’ (Green 2016).

Just a few months after its formation, SAG was absorbed into the newly formed 77th Brigade, where Tatham’s view that information, disinformation, and influence were indistinguishable was a basic operating premise.22 ‘Inspired by the successes of Israel and the USA’, the establishment of 77th Brigade was also a ‘response to Russia’s propaganda activities in the Crimea and the effective use of social media by the Islamic State’ (Merrin 2019, 122). Named in honour of Orde Wingate’s Chindit guerrilla force, 77th Brigade was tasked with bringing the same ‘spirit of innovation’ to the unorthodox environment of the online battlespace, where ‘the actions of others . . . can be affected in ways that are not necessarily violent’ (Beale 2015).

In July 2016, 77th Brigade established the organisation and command structure for both an overt online presence and its non-attributable covert systems. This resulted in the establishment of the digital operations group (Digi Ops), which is divided into two teams. Members of the production team ‘design and create video, audio print and digital products that aim to influence behaviours for both Army and external audience. Additionally, they advise on campaign strategy and propose innovative behavioural change methods’, while the Web Ops team ‘collects information and understands audience sentiment in the virtual domain. Within the extant OSINT policy framework, they may engage with audiences in order to influence perceptions to support operational outcomes’ (British Army 2020).23 The COVID-19 crisis revealed that one of the key targets for perception and behaviour influence was the British public. In April 2020, the chief of the defence staff, General Sir Nick Carter, disclosed that members of 77th Brigade were ‘helping to quash rumours about misinformation, but also to counter disinformation’. The information effects staff had been ‘tackling a range of harmful narratives online — from purported “experts” issuing dangerous misinformation to criminal fraudsters running phishing scams’ (D'Urso 2020).

While the Digi Ops team engaged in the open source environment with a range of actors, the delivery of covert strategic and tactical fires had passed to the task group, which provided ‘the deployable framework to deliver Information Activity and Outreach (IA&O)’ through one of its cells or teams (British Army 2020).24 Carl Miller suggested that the work of GCHQ’s Joint Threat Research Intelligence Group (JTRIG) provides a model for the sort of covert work undertaken by 77th Brigade. Our knowledge of JTRIG’s work comes from a series of slides Edward Snowden passed on to Wikileaks in 2013:

According to the slides, JTRIG was in the business of discrediting companies by passing ‘confidential information to the press through blogs etc.’ and by posting negative information on internet forums. They could change someone’s social media photos (‘can take “paranoia” to a whole new level’, a slide read). They could use masquerade-type techniques — that is, placing ‘secret’ information on a compromised computer. They could bombard someone’s phone with text messages or calls. JTRIG also boasted an arsenal of 200 info-weapons, ranging from in development to fully operational. A tool dubbed ‘Badger’ allowed the mass delivery of email. Another, called ‘Burlesque’, spoofed SMS messages. ‘Clean Sweep’ would impersonate Facebook wall posts for individuals or entire countries. ‘Gateway’ gave the ability to ‘artificially increase traffic to a website’. ‘Underpass’ was a way to change the outcome of online polls.

(Miller 2018)

Yet conventional militaries are still playing catch-up when it comes to the sophisticated deployment of disinformation. When ISIS forces seized Mosul in June 2014, their assault was spearheaded by a potent disinformation campaign. Employing Twitter, Snapchat, and other social media platforms, it publicised the gory fate that awaited those who defended Mosul. Over one 24-hour period, it issued almost 40,000 tweets, its output peaking at almost 200 per minute (Berger 2014). As a result, an attacking force of scarcely 1,500 ISIS fighters seized Iraq’s second city, whose 60,000-strong military and police detachment fled, their morale shattered by a precisely targeted disinformation offensive. This triumph brought home to conventional militaries around the world that they could not hope to match the enemy’s speed, agility, or virtual firepower.25 They lacked the tools, the personnel, and above all else the organisational systems they needed to optimise and deploy the information assets they possessed.

Disinformation has become a key weapon on the information battlefields of modern conflict. For today’s militaries and the governments that direct them, the question is not whether to deploy disinformation against their adversaries but how to do so to best effect while retaining their credibility with domestic audiences. Truth was never more precious than it is now, so much so that the bodyguard of lies that once protected it has grown to become an army.
