BIASES FOR NAVIGATING THE SOCIAL ENVIRONMENT

The Social Exchange Heuristic

Standard economic principles predict that players in many single-interaction economic games should defect rather than cooperate, because defecting maximizes the player’s monetary payoff. The interaction is not repeated, and the players are usually anonymous strangers, so there is no incentive to signal cooperativeness for future interactions within the game or for the sake of reputation outside of the game. Yet cooperation often occurs in these economic games (Camerer & Thaler, 1995; Caporael, Dawes, Orbell, & van der Kragt, 1989; Sally, 1995), and this cooperation is cross-culturally ubiquitous (Henrich et al., 2001).
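To make the standard prediction concrete, the one-shot logic can be sketched as follows (the payoff values below are purely illustrative and are not taken from any of the studies cited here): whichever move the partner chooses, defection yields at least as much money as cooperation, so a pure payoff maximizer should defect.

```python
# A minimal sketch (hypothetical payoffs) of why standard game theory predicts
# defection in a one-shot, prisoner's-dilemma-style game: whatever the partner
# does, defecting pays at least as much as cooperating.

# payoffs[(my_move, partner_move)] -> my monetary payoff
payoffs = {
    ("cooperate", "cooperate"): 3,
    ("cooperate", "defect"):    0,
    ("defect",    "cooperate"): 5,
    ("defect",    "defect"):    1,
}

for partner_move in ("cooperate", "defect"):
    best = max(("cooperate", "defect"),
               key=lambda my_move: payoffs[(my_move, partner_move)])
    print(f"If the partner plays {partner_move}, the best one-shot reply is: {best}")
# Both lines print "defect": with no repetition or reputation at stake,
# defection is the payoff-maximizing choice.
```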

If the mind is viewed as a rational utility maximizer, this pervasive cooperation is a puzzling phenomenon. Players act as if they expect negative consequences of non-prosocial behavior even when they are aware that, objectively, such consequences are unlikely to follow. Yamagishi and colleagues have proposed that the costs of falsely believing that one can defect without negative consequences are often higher than the costs of cooperating when one could safely defect (Yamagishi, Terai, Kiyonari, Mifune, & Kanazawa, 2007). This bias—dubbed the “social exchange heuristic”—can be conceptualized as a combination of error management and an artifact of modern living. Although this is no longer the case in many modern settings, in ancestral environments the probability of repeated encounters would have been high and social reputation effects potent. Selection may therefore have crafted the social exchange heuristic as an adaptation to this ancestral cost structure.
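The cost asymmetry at the heart of this argument can be illustrated with a rough sketch (all quantities below are hypothetical and chosen only to show the structure of the argument, not estimates from Yamagishi et al.): if defection carried social consequences with high probability in ancestral environments, and those consequences were costly relative to the payoff forgone by cooperating unnecessarily, then a cooperate-by-default bias minimizes expected cost.

```python
# A rough sketch of the error-management logic behind the social exchange
# heuristic. All numbers are hypothetical: c is the payoff forgone by
# cooperating when defection was actually safe, and k is the reputational or
# retaliation cost of defecting when the interaction turns out to matter
# (repeated partners, observers). If the ancestral probability p of such
# consequences was high and k is much larger than c, cooperating by default
# has the lower expected cost, even though it "wastes" c in genuinely
# one-shot encounters.

def expected_cost(strategy, p, c=2.0, k=10.0):
    if strategy == "cooperate":
        return (1 - p) * c   # pays c only when defection would have been safe
    else:  # "defect"
        return p * k         # pays k whenever consequences actually follow

for p in (0.1, 0.5, 0.9):
    coop, defect = expected_cost("cooperate", p), expected_cost("defect", p)
    print(f"p={p:.1f}: E[cost | cooperate]={coop:.1f}, E[cost | defect]={defect:.1f}")
# At the high encounter probabilities plausible in ancestral environments,
# the cooperate-by-default bias minimizes expected cost.
```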

The social exchange heuristic is well illustrated by the ease with which people can be made to feel they are “being watched.” Haley and Fessler (2005) asked anonymous strangers to play a series of dictator games (a type of economic game in which individuals “dictate” what portion of their endowment they will share with another player) on a computer. For some of the participants, the researchers subtly manipulated visual cues by showing stylized eyespots as the computer’s desktop background. The effect of this manipulation was striking: When using a computer displaying eyespots, almost twice as many participants gave money to their partners compared with the controls. Whether or not they were aware of it, these participants acted, in a sense, as if they were being “watched.” (See Kenrick et al., this volume, for a discussion of the evolution of economic biases in human cognition.)

Similar error management logic could explain the ubiquity of religious beliefs (Johnson, 2009). The belief that a higher power is observing and judging one’s behavior could be adaptive (and hence lead to the evolution of religious belief) because it promotes cooperation, yielding the benefits of forgoing immediate self-interest in the service of long-term cooperative gains.

 