Intersectionalities: Targeting the 'Other'

In 2015, Google rolled out a new AI-powered feature for its Photos app that analysed photos by tagging them as ‘beaches’, ‘cities’, ‘animals’, and ‘people’. A few months later, following a notorious incident in which Google Photos labelled a woman of colour as ‘gorilla’ (see Bunz and Meikle, 2018: 90-91), Google admitted it had failed to optimise the app for people with black skin. Occasionally, the neural networks that underpin AI learning and computational processes fail. While this is not strictly an example of IoT technology, we can draw an analogy: objects networked in the IoT ‘see’, but their vision may be limited. If software is unfairly biased towards accurately mapping and recognising a normative standard for facial structure, the white ‘norm’ at work in the above-mentioned ‘gorilla’ incident, does the software embedded in the IoT also risk homogenising ‘Others’ and unfairly discriminating against them by labelling them as threats more frequently? Certainly, as Lyon et al. (2012) argue and as suggested in this chapter, contemporary surveillance captures everyone, including groups that historically managed to avoid such scrutiny. However, as Lyon et al. continue, this often translates into new asymmetries in which those in positions of power emerge more powerful, while the marginalised end up being over-surveilled. Ubiquitous surveillance often reinforces and exacerbates existing inequalities. It is therefore worth exploring whether marginalised communities such as refugees, asylum seekers, ethnic and racial minorities, and formerly colonised subjects are more likely to be unfairly targeted by the development and deployment of the IoT. As Simone Browne (2012) powerfully demonstrates, racialised surveillance is alive and kicking, and there is a strong case that IoT systems are likely to make matters worse, given that surveillance is always accompanied, if not inspired, by social sorting (Lyon, 2018).

Some of these fault lines are already visible. As Andrejevic (2012: 94) suggests, ‘relatively affluent groups and places are subject to more comprehensive forms of commercial monitoring, whereas less affluent groups and places are targeted by policing and security-oriented forms of monitoring’. The implications of these diverse forms of surveillance are clear: as a white upper-middle-class woman, I will be targeted by advertisers; a woman of colour is likely to be targeted by agencies of crime control. Means Coleman and Brunton (2016) assert that this type of technological institutionalisation of social inequality is evidenced by over-policing and the disproportionately heavy surveillance measures deployed in impoverished African American neighbourhoods. Through the IoT, the socially stigmatised ‘Other’ could be not just over-surveilled but also discriminated against. Examples such as using IoT data as grounds for preventing exit, revoking visas, or denying services before a person’s application for travel, a visa, or asylum is finalised are not far-fetched. IoT systems have the potential to monitor those deemed ‘risky’ at all times as they go about their day-to-day lives. As such, the technology can unfairly discriminate against populations labelled as ‘deviant’ by virtue of their race, ethnicity, religion, or social status, and therefore considered ‘inherently inclined’ towards criminal or anti-social behaviour.

 