Disembodiment and Data-ification of Experiences
There are many ways in which digital environments seem to have stripped material environments of their “readability”, thus pushing onto the individual the effort of deciding what should be attended to: the excessive complexity of computational systems, which create hierarchies and classifications that are opaque in their constitution (as is obvious in big data); the increasing standardization and fragmentation of activities to comply with a coding logic; and the expansion of the networks of actors and the detachment of their traces from any specific identity. These are all facets of a single phenomenon that we could call the disembodiment, or data-ification, of experiences.
Algorithmic systems, acting as new epistemic membranes, seem to increase the opacity of many social phenomena. They are also changing the ways individuals are (automatically) identified, tracked, profiled or evaluated, often in real time, adding opacity (invisibility) to traditional systems of identification, evaluation and, thus, of “government”. Automated, algorithmic systems increasingly read and edit behaviours, screen emotions, and calculate and measure bodies in order to profile users and to select the most appropriate information to display or decisions to propose. However, contrary to more classical social mechanisms of socialization and control, these systems are invisible and unintelligible with respect to both their actors and their normative frames. What is certain is that these processes challenge the notion of 'alterity', since they function on a principle of similarity, drawing profiles from what is common between an individual and similar others. In so doing, they raise the question of whether an 'agora', as a space of difference and multiple “others”, remains possible.
Control of attention is overtly fought over in the arena of consumption. For companies to succeed, it is vital to master and anticipate the intentions of consumers. Understanding and predicting intentions displaces the technological objective from the current world, which needs to be organized and structured, to the future one, which needs to be discovered and possibly fabricated and controlled. The traces that consumers leave behind, and that are constantly combined with the traces of similar individuals, allow this reconfiguration of the future. They not only help to generate profiles of consumers but also, more significantly, orient consumers' access to and perception of information and thus the range of decisions they can make.
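In computational terms, the similarity principle described above is close to nearest-neighbour profiling: an individual's trace vector is compared with those of others, and the statistically closest “other” is used to complete or extend the profile. The following is a minimal illustrative sketch; the users, trace vectors and function names are hypothetical and stand in for the far larger, opaque systems discussed here.

```python
import math

# Hypothetical trace vectors: counts of interactions with four content
# categories. Purely illustrative data, not drawn from any real system.
traces = {
    "user_a": [5, 0, 3, 1],
    "user_b": [4, 1, 3, 0],
    "user_c": [0, 6, 0, 5],
}

def cosine(u, v):
    """Cosine similarity between two trace vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def most_similar(user):
    """Return the other user whose traces are closest to `user`'s."""
    return max(
        (other for other in traces if other != user),
        key=lambda other: cosine(traces[user], traces[other]),
    )

# user_a's profile is completed from the statistically nearest "other":
print(most_similar("user_a"))  # -> user_b
```

The point of the sketch is conceptual: the profile is built not from the individual's stated intentions but from the resemblance of their traces to those of others, which is exactly why such systems operate on similarity rather than alterity.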
These new techniques to attract and channel our attention aim at shaping our intentions in a sort of prospective or virtual loop. This has two consequences: the first concerns time, the second social relations. The temporality of consumption is different from that of production. In most organizations, the digitization of operations and processes has been seen as a source of rigidity and even fossilization of practices, freezing all actors in a digital cage. In the marketing world a different logic prevails: the objective is to create, in real time, constantly renewed profiles of consumers. These information systems are designed not to support the slow pace of the production process but to reflect the fleeting time of consumers' attention, which must be constantly renewed and stimulated.
The identities and social relations that emerge from these profiles are volatile and piecemeal; they create categories that individualize and separate more than they link and generate solidarities. The epistemological impossibility of determining what lies behind these groupings of individuals prevents any form of collective belonging, because the resulting social categories and classes are essentially statistical: none of them is stable; on the contrary, they fluctuate permanently. This corresponds to what T. H. Eriksen (2001) beautifully called “l'hégémonie des fragments”, the “hegemony of fragments”.
Alongside their opaqueness, computational systems by definition reduce and standardize actions. Binary systems strive to increase similarities rather than differences, fragmenting experiences into common chunks and processes. This is true both “behind the scenes”, in the logic of coding, and in the user interface, as attempts to thinly disguise the underlying entities with graphical interfaces cannot fully transform the common operations to be performed. Because the manipulation of symbolic elements on user interfaces often carries the same logic across a great variety of tasks (manipulating a client record in a call centre is analogous to adjusting a temperature setting in the control of an industrial process, which is not dissimilar to filling in a medical form), users are, from a cognitive perspective, operating at an extremely high level of abstraction and generality. This fragmentation of information, combined with the processes required by the available computational models, often decontextualizes single elements of information and contributes to users' sense of detachment. Finally, the fact that many activities are carried out as highly separated units and in social isolation also increases the sense of disembodiment.
To some extent, what we observe is the progressive dominance of a specific regime, one that Boltanski and Thévenot (1991) would qualify as an industrial regime, based on predictions, risk management, evidence-based practices and 'procéduralisme' (proceduralism). The boundary between the virtual and the real is called into question, since this evolution is part of what Kallinikos (2011) calls the long journey of human distancing from the immediate, social, living context through its abstraction into formal systems and categories, that is, the data-ification of life. Furthermore, most of these systems increasingly treat the body and its biometric attributes as the only objective or authentic source of 'personal truth', based on the central hypothesis that “the body does not lie” (Aas 2006). The flip side of this assumption is a clear lack of confidence in people, their subjectivity and their agency.
Similarly, Merzeau (2009) observes that severing digital traces from their owners transforms them into entities available for administrative or commercial exploitation. Unbound from the person they belong to and identify, these traces are open to endless “remanufacturing as new strategies and requirements emerge” (p. 24). This same phenomenon of distancing and objectification is what H. Nissenbaum (2010) addresses when she speaks of “the loss of contextual integrity” to describe the risks of ignoring identity when following Web traces.
In summary, we are seeing computational systems that develop techniques to bypass individual intentions in favour of bodily states and statistical averages, alongside a concurrent transformation of all experiences into fragmented elements of data. Together, these two trends make it harder for individuals to attribute meaningful categories to the information they attend to, and increase their dependence on external mediators to filter and structure the content they are exposed to.