EMBODIMENT AS A TOOL FOR IMPROVED SOCIAL BEHAVIOR
Understanding embodiment may even shed light on certain developmental disorders with a large social component, such as autism. For example, in contrast to typical participants, autistic individuals do not spontaneously reproduce (mimic) facial expressions when they “just watch” them, that is, without any prompts to recognize the expressions or to react to them (McIntosh, Reichmann-Decker, Winkielman, & Wilbarger, 2006). Even when autistic individuals are explicitly asked to focus on recognizing the expressions, their mimicry is delayed (Oberman et al., 2009). Because numerous other studies have shown that spontaneous mimicry aids emotion recognition, there is reason to suppose that such deficits may hinder understanding of nonverbal cues by individuals with autism (see Winkielman, McIntosh, & Oberman, 2009 for a fuller review of theory and evidence in this area). People affected by autism have also been shown to have impairments in non-emotional empathy and understanding of “other minds” (mentalizing). As discussed, these skills are partially supported by the ability to construct an embodied simulation of the other.
Interestingly, if an embodiment deficit is indeed part of the autistic profile, it should be possible to improve these individuals’ real-life emotional communication skills by training embodiment. Success in such a program would also provide a powerful example of how theories of social cognition can inform and facilitate actual interpersonal behavior. One domain where this can be readily achieved is facial mimicry, where quick motor reactions to faces can be developed by frequent pairing of a stimulus and motor response (smile to smile, frown to frown). We are currently testing this idea in our lab using a training paradigm in which typical and autism spectrum disorder (ASD) participants produce facial expressions in response to schematic facial stimuli in a video game, which was previously used to train face recognition (Tanaka et al., 2010). We are also planning an intervention program with a humanoid robot that makes realistic facial expressions (Wu, Butko, Ruvolo, Bartlett, & Movellan, 2009). An interested reader can find several videos of this robot via a simple Internet search with the words “einstein robot ucsd.” We hypothesize that these perception–action pairings will enhance the ability of ASD participants to quickly mirror facial expressions, which in turn will not only facilitate their recognition of faces but also lead others to judge them as more socially appropriate.