Coda: Governing Technological Emergence in the Digital Society: Political Tensions and Ethical Dilemmas
The introduction of this book explores a set of concepts on governance that underpin the various chapters and serve as a link between them. This was a deliberate choice: the aim was not to provide a ready-made or closed definition of governance. Indeed, the empirical findings of this book suggest that the governing of science and technology is not predictable, but takes different shapes and is enacted in different sets of practices within socio-technical and historical contexts. There is no singular definition of governance, and ethics should be recognized as an element of it. While governance is not a given, one can always imagine better forms of governance. In scholarly literature, however, politics and ethics are often kept as separate domains. The contents of this book show that technology, politics and ethics are inevitably intertwined: values are displayed in the practice of governance, and ethical dilemmas arise that need to be dealt with in democratic ways. The chapters of this book have combined approaches from STS and from ethics in an attempt to present governance as a set of practices, institutional sites, discourses, regulatory mechanisms and public responses that are not given but negotiated. Ethics is both a regulating element in those negotiations and an effect of them. Only in this way can the governance of science be a democratic process.
While avoiding a single definition of governance, the introduction provides some ideas on how governance is often understood as a working concept in scholarly and policy contexts. This conclusion revisits that discussion as a point of departure and as a way of making sense of the empirical findings of the book as a whole. As explained in the introduction, the book is a collective effort by a group of scholars working in a European context and collaborating on a project supported by the European Commission. In European policy contexts, the emergence of technologies such as geo-engineering and smart ICT systems is often portrayed as the preferred formula for addressing grand challenges, while it is also emphasized that these technologies are not only promising but also entail important potential risks. With this focus on potential, promises and risks, the governance of emerging technologies is
© Springer International Publishing Switzerland 2016
A. Delgado (ed.), Technoscience and Citizenship: Ethics and Governance in the Digital Society, The International Library of Ethics, Law and Technology 17,
then understood as being irremediably oriented towards the future and inhabited by the Collingridge dilemma:
The social consequences of a technology cannot be predicted early in the life of the technology. By the time undesirable consequences are discovered, however, the technology is often so much a part of the whole economic and social fabric that its control is extremely difficult. This is the dilemma of control. When change is easy, the need for it cannot be foreseen; when the need for change is apparent, change has become expensive, difficult and time consuming. (Collingridge 1980:11 in Nordmann 2010).
The ethical concern behind the Collingridge dilemma is the impossibility of exercising control over technological development, use, application and impacts. The diagnosed problem is, ultimately, a lack of knowledge. In response, a number of technologies of governance have been developed over the years in the European policy context, such as different forms of public participation, opinion polls, scenarios and foresight exercises. As part of this policy context, the authors of this book received financial support from the European Commission (FP7 program) to collaborate on a project called Technolife, of which this book is a result. Using movies to trigger debate, Technolife developed three online forums to facilitate discussions on emerging technologies that increasingly affect people’s everyday lives: digital maps, biometrics and human enhancement technologies (see the introduction to this volume). People whose everyday lives were likely to be affected by the development of these technologies were invited to participate. Perhaps the most remarkable result from the forums was that, although participants did show a concern with ‘controlling technology’, that concern was much less about risk and much more about politics. Many participants voiced concerns not so much about the uncertainties and risks that come with technological emergence (as in the Collingridge dilemma) as about who controls technology and the effects of that control on people’s lives. Many were critical of the monopolies of mass media and large corporations, and many saw politics and bureaucracies as outdated institutions. While technology was often presented as having great potential for social change, institutions were seen as unable to cope with change:
In this era of rapid and sweeping advancement, we see the old world struggling to guide and restrain the process of advancement into the new (next?) world. Recording companies howl bloody murder in the old courts about people ‘stealing their livelihood’ by making and distributing pirate copies of their intellectual properties. Yesterday’s telephone companies become today’s facilitators of information and entertainment access. World governments gnash their teeth at the possibility of new technologies sparking sweeping economic change and the dashing of the old world’s entrenched economic power structures. Change will happen according to the will and abilities of the masses, regardless of the old world sensibilities (Body enhancement forum participant A)
The only thing I am concerned about is if all of this would be affordable to common people. I don’t care if someone doesn’t want to improve memory or add years to life or technologically advance their body. I care if someone wants to do that but lacks money (BH, forum participant B).
Right now, the biggest problem I see is the fact that these new technologies are being developed in a hyper-capitalistic environment and are being registered to pharmaceutical companies (referring to human enhancement technologies) (BH, forum participant C)
I’ve never seen an institution with the slightest interest in improving our lives (BH, forum participant D).
Expressed distrust in institutions was often accompanied by (sometimes extreme) technological optimism, especially concerning the type of distributed social action enabled by ICTs:
Push a little further down that line and we might see fully-automated virtual tools that let the layperson design unique organisms via their home computer and distribute the fruits of their labors to all interested parties across the globe with one tiny command. When that day comes, Big Pharma will compete against the ubiquity of information and the will of the people, and it will lose
Imagined as ubiquitous and decentralized, digital technologies are said to potentially endow people with economic and political autonomy. Somewhat ironically, it is the supposedly deregulated character of ICTs that may give people more control (in the sense of independence), as the quote above indicates. The use of ICTs across different technological and non-technical domains was emphasized in these debates as posing political challenges that institutions were not prepared to address. These challenges were frequently discussed as issues of privacy (versus security), openness, distributed ownership, and freedom (to enhance bodies and minds, among other things). As articulated in the debates, concerns repeatedly turned on who controls technology and by which means. In these discussions, the forum participants confronted an institutionalized way of thinking about and practicing the governance of emerging technologies that is still too focused on uncertainty, risk and evidence to treat governance as a political matter. Furthermore, issues such as privacy and access to technology and information were articulated as ‘claims to rights’ (Ruppert and Isin 2015). The formulation of these issues as claims to rights can be seen as enacting confrontation against mass security systems, surveillance or restricted access, as well as a way of confronting governance as usually exercised. In other words, the forum participants were performing not only as stakeholders, as they are usually considered in policy contexts (particularly in institutionalized public engagement exercises, where issues are often formulated as public opinions, views or perceptions, which can also be viewed as a way of distilling political agency), but also as citizens, through formulating claims to rights.
The chapters of this book echo the concerns that emerged in the Technolife discussion forums. They provide empirical findings and discussions around the issue of “controlling technology”, paying attention to ethics, politics and citizenship as they appear in three contexts of technological emergence. As technological interventions on citizens’ bodies, their movements and the spaces they inhabit are explored throughout the book, a political tension becomes apparent: ICT-based technologies, such as digital maps, can produce a double effect. On the one hand, these technologies are often embedded in larger infrastructures and allow new forms of institutional control, as we see in the case of biometrics and the attempts at controlling citizens’ movements across borders. For instance, the insistence on “interoperability” of biometric systems would eventually enable control to be extended beyond national borders in the pursuit of a common European border; interoperability thus turns out to be a technology of government. On the other hand, digital platforms and devices may be seen as enabling a closer relation with technology, one in which people can, to some extent, exert a certain control over (and through) the technology in their own lives and choices. This is not a straightforward effect, because technologies can also produce an ambiguous sense of simultaneously gaining and losing control over one’s own life. Chapter 3, on cochlear implants, shows how people using this technology are described as needing to sacrifice some important dimensions of the way they experienced their lives in order to learn how to live with an enabling technology.
Recurrent attempts at gaining control over technologies that affect people’s lives reveal incipient ways of enacting values such as autonomy, privacy and freedom. As mentioned, these values were repeatedly invoked, discussed and claimed in the Technolife forums (interestingly, across all of the forums and in different technological fields). Sharing and open access were often invoked. Within digital infrastructures, information appears to be more accessible and to circulate more openly, but digital applications can also be more easily tracked. The possibility of tracking citizens’ movements, combined with increasing systems interoperability, provides states with enhanced visibility and control capabilities. At the same time, states cannot be autonomous in the development and implementation of such large-scale technological systems, because they are largely dependent on corporate interests. This tension between large, national and local scales, as well as between monopolies and distributed economic and political agencies, was mentioned many times in the forums and was explored in the section on biometrics. The findings from the forums suggest that the huge efforts deployed by governments to produce a technology of trust (to make Europeans feel safe in a “security envelope”) may themselves reveal a lack of trust. Thus, we could conclude with Marilyn Strathern that “a benevolent or moral visibility is all too easily shown to have a tyrannous side—there is nothing innocent about making the invisible visible” (2000: 309). What was ultimately questioned in the section on biometrics was the type of visibility (and its effects) that such technological systems and infrastructures deploy as they work to make the movement of citizens an object of transparency.
Finally, a tension recurrently referred to in the Technolife forums was that between the individual and collective political agencies entangled in, and produced within, processes of technological emergence. “Many share the commitment to values such as pluralism and the individuality of choice. What’s more, many explicitly articulate these values in direct connection with a concern over standardized, top-down institutions, modes of production, distribution of goods, resources and information” (Strand and Rommetveit 2011). Technological decentralization, individual freedom and diversification were opposed to controlling governmental agencies that are known to impose regulation and standardization. However, the recurrent criticisms of monopolies and capitalism were difficult to interpret, as they could lean towards libertarianism, extreme liberalism, or indeed towards something not yet known. In the governance of emerging technologies, such as those explored in this book, that tension is performed within practices of information sharing. As in the sharing economy, the broad circulation of information on networked platforms is expected to produce value. By using digital maps, people give away some information in order to be able to make other choices; one example is customizing spaces through cell phone apps. The figure of the citizen as a consumer thus co-exists with collective forms of action enabled by technology, such as when people produce maps for monitoring their environment or visualizations of polluted areas.
This conclusion has summarized some of the tensions that arise in the governance of emerging technologies, tensions that were main findings of the Technolife project and that are reiterated by the findings of this book. As these findings suggest, a main challenge for institutions is to respond to the actual state of affairs and to address emerging technologies not only as tools for coping with societal challenges but also as embedded within constellations of socio-political relations, posing new societal challenges as they emerge. Today, ICTs are loaded with optimism and hope on the side of both institutions and the citizenry; however, in the pursuit of better forms of governance, the reconfiguration(s) of power relations that come with technological emergence should not be concealed by an excessive reliance on technology.
References

Nordmann, A. 2010. A forensics of wishing: Technology assessment in the age of technoscience. Poiesis and Praxis 7(1): 5–15.
Ruppert, E., and E. Isin. 2015. Being digital citizens. London: Rowman and Littlefield International. ISBN 978-1-78348-055-5.
Strand, R., and K. Rommetveit. 2011. Technolife project final report. Retrieved from http://technolife.no/content/filelist_b9b0f429-0e6c-49f5-8944-24541635e46e/1336510179106/technolife_final_report_for_website.pdf.
Strathern, M. 2000. The tyranny of transparency. British Educational Research Journal 26(3): 309–321.