The themes underpinning, enabling, and ensuing from digital frontier technologies

In discussing AI, the IoT, distributed ledger technologies, and autonomous mobile robots, some common ground begins to emerge. First, these technologies seem to be in a permanent state of transition and flux. As will be explored in the following chapters, the scale of development and progress is astounding, often described as exponential. These technologies also leave a digital trail by generating, storing, and exchanging data about their activity, us, and the environment. They are things, machines, or software applications that sense and exchange, collaborate, verify, and act, based on data, sometimes without our knowledge or input. Digital frontier technologies also raise concerns about (personal, community, national, border) security. Their users face implications pertinent to their privacy, but other actors who never agreed to a technology’s ‘terms and conditions’ are also affected by it: bystanders, passengers, buyers, and sellers. We start by unpacking the complex phenomenon of big data.

Big data: The new oil?

An enormous amount of data is generated in our micro-universe every single day. From the moment we wake up, and even as we sleep, a range of artefacts constantly observe, store, exchange, and analyse data about us and our activities. Data is a backlog, a history of our existence—and one that we, at times, cannot erase. Often, we are unaware of the process of data creation. A computer logs cookies every time you use it, without you knowing what happens under the bonnet; a smart toothbrush collects and stores data about your cleaning practice every time you turn it on; your car records data about your travel and its own health. The examples above are what Rob Kitchin (2014) calls ‘automated’ processes of generating big data, in which records are created autonomously, without our interference. On top of this, we also ‘volunteer’ data by using social media platforms or loyalty cards. The third method of generating and collecting big data is directed—when human agents or organisations obtain data via, for example, public closed-circuit television (CCTV) systems. Once collected, a range of actors receive data about us and our activities: software developers, companies that own the products we wear or use, advertisers, banks, doctors, police, the government, and the list goes on. Notably, big data ‘appear to offer answers to a wide array of problems of (in)security by promising insights into unknown futures’ (Aradau and Blanke, 2017: 1). As such, commentators and stakeholders see big data as a tool that can help pre-empt future crimes. In examining data, we analyse the present but focus on the future. However, as I suggest later, ‘a lot can go wrong, when we put blind faith in big data’ (O’Neill, 2017: 2:11).

Definitions of big data often include ‘three Vs’: the volume of data, the increased velocity, and the variety of collected data (Bennett Moses and Chan, 2014; Chan and Bennett Moses, 2016). Some authors add more Vs to the mix: veracity, value, and vulnerability (Zavrsnik, 2018). The power embedded in having access to and control of big data is so immense that in 2006, the British mathematician Clive Humby (cited in Wall, 2018: 29) suggested that big data is the new oil. The similarity does not stop there: big data, like oil, cannot be used until refined. Data need to be analysed and transformed into a product that can be used for a variety of purposes: from predicting consumers’ behaviour and advertising products and services, to crime forecasting. Giving meaning to datasets transforms data into information, generating a more comprehensive picture of the phenomenon that is structured and arranged in a particular context and provides knowledge (Kitchin, 2014; Lupton, 2014). Big data, thus, also refers to the process of using data more productively and profitably (Cale et al., 2020) via data mining—extracting patterns from large datasets. Put simply, data mining or ‘data crunching’ is making sense of data and using information in the most effective (or lucrative) way. Still, as boyd and Crawford (2012: 662-663) suggest, there is another essential element that defines big data (alongside technology and analysis): mythology, ‘the widespread belief that large data sets offer a higher form of intelligence and knowledge that can generate insights that were previously impossible, with the aura of truth, objectivity and accuracy’. This often-criticised notion that algorithmic decision-making is both omnipotent and bias-free will be explored at length in the following chapters.

In 2020, data-linkage, the process in which big data informs policy (Cale et al., 2020), is a standard practice across the globe. During the outbreak of COVID-19, for example, big data was at the epicentre of efforts to comprehend and forecast the impact of the virus, identify the carriers and infected persons, and determine the best practice in containing the disease (Bean and CIO Network, 2020). In the criminal justice system, big data has seen application in crime prevention, surveillance, and prediction of the likelihood of offending and recidivism (Cale et al., 2020). The role of big data is so significant that it has changed the terminology in almost all areas of criminological engagement with crime and punishment: crime-relevant knowledge is now routinely dubbed databases, reasoning equals algorithms, crime prevention and investigation is called predictive policing, while criminal prosecutions became examples of automated justice (Zavrsnik, 2018). Big data also serves to promote specific political and financial interests: ‘doing more with less’ and getting more money for it is the bottom line for its use (Zavrsnik, 2018). The monetisation of our data and data about us is critical in the development of digital frontier technologies. However, this aspect of DFTs, while briefly mentioned, will not be the focus of attention in this volume (for an overview of the literature and key themes see Powell et al., 2018).

Critically, most of us do not understand big data: how it works, or the algorithms used to process it. Creating strategies and the ability to see through what Lyon (2015) calls ‘techno fog’ is an essential feature of this book. As I will demonstrate in the following chapters, in the future Internet the expansion and application of big data bring a concern that decisions about crime and responses to offending are likely to be non-transparent and beyond our scrutiny. The prospect that such decisions might not be in the hands of human experts only complicates the matter. Decisions about our likelihood to commit a crime in the future, criminal accountability, sentencing, parole, and bail might be based on patterns identified via big data and performed and executed by algorithms.
