If the results from two successive samples are very different, you can't tell whether it's because people's attitudes or reported behaviors have changed, because the two samples comprise different people, or both. The panel design deals with this problem. In a panel study, you interview the same people again and again. Panel studies are like true experiments: Participants are tracked for their exposure, or lack of exposure, to a series of interventions in the real world.
Perhaps the most well-known panel study of all time is the Framingham Heart Study. In 1948, medical researchers began tracking 5,209 men and women between 30 and 62 years old from one small town—Framingham, Massachusetts. In 1971, as the original panel began to die off, another 5,124 panelists were added—this time, the original panelists’ adult children and their spouses. Every 2 years, all the panelists go in for a complete medical check-up. This study has identified and nailed down the major risk factors for heart disease, which include behaviors (exercise, smoking) and inner states (attitudes, stress) that anthropologists, as well as epidemiologists, are interested in. Basic information about the Framingham study is available from the National Heart, Lung, and Blood Institute at http://www.framinghamheartstudy.org/index.html.
An important panel study in sociology is the Wisconsin Longitudinal Survey, which has followed 10,317 people in Wisconsin since they graduated from high school in 1957. The graduates were contacted in 1964 and again in 1975. In 1992-1993, the researchers tracked the 9,741 survivors of the original 10,317 and interviewed 87% of them by phone for an hour. The team did another round of interviews in 2004-2005 and they are planning another wave for 2022, when the Wisconsin high school class of 1957 will be 83 years old (Hauser 2005). The nonsensitive data from the WLS are available at http://www.ssc.wisc.edu/wlsresearch, and the sensitive data (about, for example, sexual preference, addiction, mental health, or criminal behavior) are available to qualified researchers at http://www.ssc.wisc.edu/cdha/data/data.html.
Panel studies are rare in cultural anthropology, but the few that exist make clear how important this kind of data is for tracking change over time (see Gravlee et al. for a review). The Tsimane’ Indian Panel Study (TAPS), begun in 1999, follows the effects of market exposure on the Tsimane’ Indians in villages along the Maniqui River in the Bolivian Amazon (http://www.tsimane.org). Villages vary in how close they are to the market town of San Borja—and hence experience more or less market exposure—so, with panel data, the team can assess changes over time in things like farming practices, nutritional status, and ethnobotanical knowledge. For example, Godoy et al. (2007) found that the Tsimane’ protected their children’s food consumption during lean economic times. Vadez et al. (2008) showed the dramatic effect on deforestation caused by increased rice cultivation by the Tsimane’. Contrary to expectations, though, neither walking time to San Borja nor the presence of a permanent road had any effect on deforestation.
Panel studies can be done quickly—even within the year or two of most anthropological fieldwork. Amber Wutich (2009) studied the effects of water scarcity on social interaction in an urban squatter settlement in Cochabamba, Bolivia. After a couple of months of in-depth interviewing, she and several assistants began a five-wave panel study of a random sample of 72 (out of 415) households. They interviewed people in each household every 2 months for 10 months and found significant variation in the size of personal networks across the five waves. As predicted by theory, just as the dry season began, people tried harder to mobilize their networks to get more water. Then, as the dry season advanced, people withdrew from their networks. They knew it was useless and couldn’t afford the risk that they’d have to reciprocate if they did score some water. And then, as the dry season ended, people went back to their old social interaction pattern (box 9.8).
People drop out between successive waves of panel surveys. If this happens, and the results of successive waves are very different, you can’t tell whether that’s because of (1) the special character of the dropout population, (2) real changes in the variables you’re studying, or (3) both. For example, if dropouts tend to be male or poor, your results in successive waves will overrepresent the experiences of those who are female or affluent. If you run a panel study, consult a statistician about how to test for the effects of attrition.
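One common first check for attrition bias is to compare the people who dropped out between waves with those who stayed on some wave-1 trait. Here is a minimal, hypothetical sketch of that idea—all the counts are invented for illustration, and a real analysis would check several traits and consult a statistician about remedies such as weighting:

```python
# Sketch: do dropouts differ from stayers on a wave-1 trait (here, gender)?
# A 2x2 chi-square test is one common first check for attrition bias.
# All counts below are hypothetical, for illustration only.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], without continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# rows: stayed in panel / dropped out; columns: male / female
stayed = (80, 90)
dropped = (25, 10)
chi2 = chi_square_2x2(*stayed, *dropped)

# 3.841 is the 0.05 critical value for 1 degree of freedom
print(f"chi2 = {chi2:.2f}; attrition bias suspected: {chi2 > 3.841}")
```

A significant result says only that the remaining panel no longer looks like the original sample on that trait; it does not, by itself, tell you how much your substantive results are distorted.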
USING ETHNOGRAPHIC DATA IN PANEL STUDIES

Anderson-Fye (2004) studied the body image and eating behavior of 16 adolescent girls in San Andres, Belize. She began her work with the usual year-in-the-field stint (1996-1997), but went back for several months in each of the next 5 years and did in-depth interviews with each of her informants. While in high school, the girls were satisfied with their body shape and image, but over time, 4 of the 16 developed attitudes or behaviors, or both, characteristic of eating disorders. For example, one of Anderson-Fye's informants, Kara, was happy with her body image but wound up taking pills and exercising in an effort to be skinny. It turned out that her parents ran a gift shop that catered to American tourists and instructed Kara to be "thin, pretty and friendly" because, as they told Anderson-Fye, those were traits valued by Americans (p. 579). Ethnographic panel data made it possible for Anderson-Fye to track the emergence of these attitudes and behaviors and to tie them to likely causes.

Respondent mortality is not always a problem. Roger Trent and I did a panel study of riders on the "People Mover" in Morgantown, West Virginia, an automated transport system that was meant to be a kind of horizontal elevator. You get on a little railway car (the cars carry only 8 seated and 12 standing passengers), push a button, and the car takes you to your stop—a block away or 8 miles across town. The system was brought on line a piece at a time between 1975 and 1980. Trent and I tracked public support as the system went more places and became more useful (Trent and Bernard 1985). We established a panel of 216 potential users when the system opened in 1975 and reinterviewed the members of that panel in 1976 and 1980 as more and more pieces of the system were added.
All 216 original members of the panel were available during the second wave, and 189 were available for the third wave of the survey. Note, though, that the people who were unavailable had moved out of Morgantown and were no longer potential users of the system. What counted in this case was maintaining a panel large enough to represent the attitudes of people in Morgantown about the People Mover system. The respondents who stayed in the panel still represented the people whose experiences we hoped to learn about (Further Reading: panel attrition).