Engineering the Uncanny

Blendie is a kitchen mixer that the user speaks to and that speaks to the user (Dobson 2007a).11 To interact with Blendie, users mimic the sounds made by a common kitchen mixer, and Blendie responds with a mechanical rendering of those sounds repeated back to the user, produced by varying the speed of its motor. As depicted in the project documentation,12 a person approaches Blendie and begins to make all variety of grunting and whining machine-like sounds of high and low pitch. This instigates a response from Blendie of either fast or slow rotation of the motor, which produces a corresponding whirring sound. Over time, the back and forth of sounds uttered and imitated between Blendie and the user begins to suggest a dialog of sorts, albeit a strange one.

In an encounter between Blendie and a user, the user begins to transform herself to be more machine-like to elicit a response from the mixer.

Figure 3.1

Kelly Dobson, Blendie (2007a)

Adjusting one's modes of behavior to elicit responses from others is common and is fundamental for effective communication. Moreover, people often adjust their behavior to make use of or otherwise interact with machines—for example, slowing one's pace when approaching automatic doors to allow time for the system to note one's presence and respond by opening the doors. What is uncommon and arresting with Blendie is the requirement of dramatically adjusting human communicative modes with machines to, in effect, perform machine behavior.

This encounter between a person and a robotic kitchen mixer was designed by Kelly Dobson. Blendie is part of a larger body of work by Dobson titled Machine Therapy, which "tweaks technological artifacts in order to explore their sensitive and emotional side" (Dobson 2007b). Like PARO the baby seal robot, Blendie is designed for therapy. But the purposes and modes of therapy between these two robots are remarkably different. Whereas PARO's therapeutic purpose is alleviating human loneliness and remedying a perceived lack of affective exchange, Blendie's therapeutic purpose is the reflective exploration of human relationships with machines. With Blendie, the subject of the therapy is our relationship with devices. In addition, the mode of therapy advanced by Dobson's design is in stark contrast with that of PARO. Whereas PARO is designed to elicit and support a placid, soothing experience of therapy grounded in amity as healing, Blendie is designed to support a model of therapy grounded in unease and confrontation as the cure—the model of psychoanalysis (Dobson 2007a).

The Machine Therapy project is agonistic in that it provides alternate modes of interacting with robots that bring to the fore, rather than mitigate, the tension and anxiety that frequently characterize our relationships with technology. Throughout entertainment media, robots are often used to signal and explore these tensions and anxieties in amplified form. In many film representations of robots—Maria in Metropolis (1927), Deckard in Blade Runner (1982), the Terminator series (1984, 1991, 2003, 2009), the artificial boy David in Artificial Intelligence: AI (2001)—the robot is a figure at odds with its identity and our relationship to it. In Metropolis, it is seductively attractive yet inhumane and despotic; in Blade Runner, it is self-loathing; in the Terminator series, it is a brutal assassin transformed into a brutal savior; and in Artificial Intelligence: AI, it is a discarded anthropomorphic appliance. Exploring these tensions and anxieties through an object such as Blendie continues the tradition of the robot as a kind of reflective other but with an opportunity for interaction unavailable through other forms of media. And although this tradition of the robot as a reflective other exists within the cultural representations of robots (in films, television, and fiction), it is conspicuously uncommon in actual social robot design.

In giving voice to the tension between people and technology, Blendie affectively manipulates that anxiety and plays on the uncanny, which is a particularly compelling trope for exploring human-robot relationships because it operates by troubling existing categories of form, function, purpose, and being. In his essay "Das 'Unheimliche,'" Sigmund Freud characterized the uncanny as an experience in which the familiar suddenly becomes strange, resulting in a sense of psychological fear (Freud 2003).13 Freud explores several instances of the uncanny and the common themes that run through them. One theme is animism and anthropomorphism, or the attribution of lifelike, human qualities to inanimate objects. This theme is still explored today and is found in many films that cross the boundary between science fiction and thriller and present visions of computational systems gone awry. In The Shaft (2001), an intelligent elevator is out for revenge, and in One Missed Call (2008),14 mobile phones and data networks are possessed by a malevolent entity. More generally than elevators with a vengeance or malevolent mobile phones, the uncanny can be taken to be the wearing away of the distinction between the real and the imagined, "when we are faced with the reality of something that we have until now considered as imaginary" (Freud 2003, 150).

The uncanny has also found its way into robotics research through the notion of the uncanny valley. In 1970, roboticist Masahiro Mori postulated that as a robot appears more human, our acceptance of it increases. This acceptance follows an upward curve until the robot's resemblance to a human being reaches what is called the uncanny valley—a conceptual space in which the resemblance between a robot and a human is almost identical, and the tension between this difference and sameness is disturbing (Mori 1970). Most often, the uncanny valley refers to the visual appearance of the robot, but it is not limited to the visual realm alone. Mori accounted for a more nuanced notion that extends visual appearance to include aspects of presence, behavior, and interaction—a more robust notion of the uncanny that spans various kinds of embodiment. Although the visual appearance often seduces us into thinking that an object is real or alive, it is usually other forms of embodiment that produce the experience of the uncanny. Consider an example of the uncanny valley used by Mori—shaking hands with a corpse. We expect the hand to feel warm and supple, and yet it is cold and rigid, disturbingly contrary to our notions and personal experiences of bodies. Even though the uncanny valley has just begun to be systematically examined (as much as it can be), the idea has been perpetuated in the robotics community over the past several decades.15 Generally, it is deemed a place to be avoided in the design of robots, particularly robots intended for interaction with humans, because it is seen as a hindrance to the acceptance of robots.

In the context of social robot design, the uncanny is a theme and sensation that reflects the remainder. What is veiled or excluded by design in social robots is the apprehension, confusion, and anxiety often experienced by people who encounter objects that feign to be something other than they are and that invite interaction with them in a personal, even intimate, manner. From an agonistic perspective, however, the reasons that most researchers and designers attempt to avoid the uncanny can be recast as reasons to induce it. Uncanny encounters with robots produce troubling engagements between intelligent artifacts and people. They prompt reflection on the nature and substance of the relations between people and robots.

What is experienced or witnessed with Blendie is a reconfiguration of robot therapy and human-robot relations that leverages the uncanny to produce an agonistic encounter. This encounter is agonistic in that the very anxieties and tensions between intelligent artifacts and people that are usually smoothed over by design are here, by design, made into the basis of the interaction. Interacting with Blendie occurs not through placid stroking but rather through an agitated exchange of growling at the machine and having it growl back.

This reconfiguration is materially and experientially enacted through the design of the robot's embodiment. For Blendie, Dobson developed audio sensors and software that were fitted within the housing of a commercial kitchen mixer. The audio sensors monitor and register sound (the noises made by humans as they grunt and whine at the machine), analyze the sound for frequency and pitch, and then translate those qualities of human sound into numeric values that can be used to vary the speed of the mixer's motor, which acts as Blendie's projected voice. This may seem like a simple form of coupling, but it is not simplistic. It demonstrates a sophisticated understanding of how to sculpt embodiment by design through the medium of computation (figure 3.2). As Dobson (2007a, 80) explains:

Blendie works by taking in the sound of a person interacting with it through a microphone and processing that sound on a computer running custom software written in C++. The program computes an STFT (short time Fourier transform) to detect the dominant pitch, and an FFT (fast Fourier transform) of this STFT to look for time-domain frequency modulation. If it detects modulation in a range that has been predetermined as a close human approximation to the rough guttural sound of the blender's motor, Blendie then is given the correct amount of power to allow it to spin at a speed that will produce the same dominant pitch of the person's voice. The power is adjusted using PWM (pulse width modulation) of the AC (alternating current) line supplying power to Blendie. The proper PWM for a given pitch is returned from a large lookup table in the software custom made for the blender. The software can tell a human voice from a blender sound, and thereby can keep Blendie from forever feeding back on itself, because a human imitation of a blender is very different from the sound of the blender itself.
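Dobson's software ran in C++ on a computer coupled to the mixer; the snippet below is only a rough Python sketch of the kind of pipeline the passage describes: take a short-time Fourier transform (STFT) of the incoming sound, track the dominant pitch, check for machine-like modulation of that pitch, and map the result to a pulse-width-modulation (PWM) duty cycle for the motor. The sample rate, frame sizes, modulation band, and pitch-to-PWM lookup table are hypothetical placeholders, and the modulation check stands in loosely for the FFT-of-the-STFT step Dobson mentions.

```python
# Illustrative sketch (not Dobson's C++ code): dominant-pitch detection via an
# STFT, a crude check for modulation of that pitch over time, and a lookup from
# pitch to a PWM duty cycle. All numeric constants are hypothetical placeholders.

import numpy as np

SAMPLE_RATE = 16_000              # Hz, assumed microphone sample rate
FRAME_SIZE = 1024                 # samples per STFT frame
HOP_SIZE = 256                    # samples between successive frames
MODULATION_BAND = (2.0, 20.0)     # Hz, assumed "blender-like" modulation range

def dominant_pitch_track(signal: np.ndarray) -> np.ndarray:
    """Return the dominant frequency (Hz) of each windowed frame."""
    window = np.hanning(FRAME_SIZE)
    freqs = np.fft.rfftfreq(FRAME_SIZE, d=1.0 / SAMPLE_RATE)
    pitches = []
    for start in range(0, len(signal) - FRAME_SIZE + 1, HOP_SIZE):
        frame = signal[start:start + FRAME_SIZE] * window
        spectrum = np.abs(np.fft.rfft(frame))
        spectrum[0] = 0.0                      # ignore the DC component
        pitches.append(freqs[np.argmax(spectrum)])
    return np.array(pitches)

def modulation_in_band(pitch_track: np.ndarray) -> bool:
    """Check whether the pitch track wobbles at a 'machine-like' rate.

    Dobson describes taking an FFT of the STFT to find time-domain frequency
    modulation; an FFT of the frame-rate pitch track is used here as a rough
    stand-in for that step.
    """
    frame_rate = SAMPLE_RATE / HOP_SIZE        # pitch estimates per second
    detrended = pitch_track - pitch_track.mean()
    mod_spectrum = np.abs(np.fft.rfft(detrended))
    mod_freqs = np.fft.rfftfreq(len(detrended), d=1.0 / frame_rate)
    peak = mod_freqs[np.argmax(mod_spectrum)]
    return MODULATION_BAND[0] <= peak <= MODULATION_BAND[1]

# Hypothetical lookup table: dominant pitch (Hz) -> PWM duty cycle (0.0-1.0).
PITCH_TO_PWM = {100: 0.2, 200: 0.4, 300: 0.6, 400: 0.8, 500: 1.0}

def respond(signal: np.ndarray) -> float | None:
    """Return a PWM duty cycle if the sound reads as a blender imitation."""
    pitches = dominant_pitch_track(signal)
    if len(pitches) == 0 or not modulation_in_band(pitches):
        return None                            # not machine-like; stay silent
    target = float(np.median(pitches))
    nearest = min(PITCH_TO_PWM, key=lambda p: abs(p - target))
    return PITCH_TO_PWM[nearest]
```

In an actual build, the returned duty cycle would still have to be handed to a driver circuit that chops the AC line supplying the motor; the sketch stops at the software side of that coupling.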

Dobson's design evokes the uncanny by intentionally reconfiguring the standard mode of embodiment from human language to machine sounds. This inverts the common relationship of person to machine, in which the person is (at least theoretically) given prominence. Blendie's embodiment is thus machine-centric: the basis for the coupling is set in machine terms rather than human terms.

Figure 3.2

Sketch depicting the design of Blendie's embodiment, Kelly Dobson (2007a)

This shifting of the basis of embodiment draws attention to a we/they relationship. Within theories of agonism, the notion of a we and a they—of an us and a them—is central to establishing difference and an adversarial stance (Mouffe 2000a, 2005b). Through these categories of difference, distinctions between beliefs, values, and practices are organized and expressed. The performance or acting out of the relationship between these categories, which is often one of tension, is the conflict that defines agonism and represents the political condition. However, the categories regarding social robot design are notably different. Rather than familiar distinctions of we and they, such as the ideological left and right, liberal and conservative, or pro and anti any given subject, here the we/they is, at least initially, a distinction between humans and machines. The tension that is identified and brought to the fore concerns how we conceptualize what is human and what is machine and how these conceptualizations inform and interact with each other.

But beyond just establishing those boundaries of and interactions between the we of humans and the they of robots, the agonistic endeavor in adversarial design is to investigate and question them as categories with political significance that are open to critique and reinterpretation. The challenge and opportunity of the uncanny as witnessed in Blendie goes beyond questions of whether a thing is or is not human, whether it is or is not alive. In fact, uncanny encounters transform the nature of we/they relations beyond initial simplistic categories of human and machine. Rather than rashly either valorizing those categories or making claims of dismantling them, one should instead consider the qualities that underlie those categories and the permeability between those categories. The political issue of concern for design with social robots is not registering or denying humanness. Rather, the political issue of concern for design is how people and robots are going to comingle.

 