THE RESPONSE RATE PROBLEM

Mailed questionnaires can be very effective, but there is one problem with them that all survey researchers watch for: getting enough of them back. In 1936, the Literary Digest sent out 10 million straw poll ballots in an attempt to predict the winner of the presidential election. They got back 2.3 million ballots and predicted Alf Landon over Franklin Delano Roosevelt in a landslide. Roosevelt got 61% of the vote.

You’d think that 2.3 million ballots would be enough for anyone, but two things caused the Digest debacle. First, they selected their sample from automobile registries and telephone books. In 1936, this favored richer people who were more likely to be Republican. Second, the 2.3 million ballots were only 23% of the 10 million sent out. The low response rate biased the results in favor of the Republican challenger since those who didn’t respond tended to be poorer and less inclined to participate in surveys (Squire 1988).

How to Adjust for Nonresponse

Skip to 1991. The American Anthropological Association sent questionnaires to a sample of 1,229 members. The sample was stratified into several cohorts who had received their Ph.D. degrees beginning in 1971-1972 and ending in 1989-1990. The 1989-1990 cohort comprised 306 then-recent Ph.D.s. The idea was to find out what kinds of jobs those anthropologists had.

The AAA got back 840 completed questionnaires, or 68% of the 1,229, and 41% of those responding from the 1989-1990 cohort said they had academic jobs (American Anthropological Association 1991). The AAA didn’t report the response rate by cohort, but suppose that 68% of the 1989-1990 cohort—the same percentage as applies to the overall survey—sent back their questionnaires. That’s 208 out of 306 responses. The 41% who said they had academic jobs would be 85 of the 208 respondents; the other 123 had nonacademic jobs.
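The arithmetic behind those figures is easy to check. Here is a minimal sketch (in Python, not part of the original text) that reproduces the assumed breakdown of the 1989-1990 cohort, taking the 68% response rate as a working assumption:

```python
# Back-of-the-envelope check of the assumed 1989-1990 cohort breakdown.
# Assumption (from the text): the cohort responded at the overall 68% rate.
cohort_size = 306
response_rate = 0.68     # assumed to match the overall survey
academic_share = 0.41    # share of respondents reporting academic jobs

respondents = round(cohort_size * response_rate)   # 208
academic = round(respondents * academic_share)     # 85
nonacademic = respondents - academic               # 123

print(respondents, academic, nonacademic)          # 208 85 123
```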

Suppose that everyone who didn’t respond (32%, or 98 out of 306) got nonacademic jobs. (Maybe that’s why they didn’t bother to respond.) In that case, 98 + 123 = 221 out of the 306 people in the cohort, or 72%, got nonacademic jobs that year, not the 59% (100% minus 41%) implied by the survey.
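Extending the sketch above, the worst case, in which every nonrespondent holds a nonacademic job, works out as follows (again using the assumed counts, not data from the survey itself):

```python
# Worst-case bound: assume every nonrespondent in the cohort
# took a nonacademic job (counts assumed as in the text).
cohort_size = 306
respondents = 208
nonacademic_respondents = 123

nonrespondents = cohort_size - respondents             # 98
worst_case = nonrespondents + nonacademic_respondents  # 221
print(round(100 * worst_case / cohort_size))           # 72 (vs. 59 implied by the survey)
```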

It’s unlikely that all the nonresponders were in nonacademic jobs. To handle the problem of nonresponse, the AAA might have run down a random grab of 10 of the nonresponders and interviewed them by telephone. Suppose that 7 said they had nonacademic jobs. You’ll recall from chapter 6 on sampling theory that the formula for determining the 95% confidence limits of a point estimator is:

p ± 1.96 √(pq/n)

where p is the sample proportion, q = 1 − p, and n is the sample size, which means that

.70 ± 1.96 √((.70)(.30)/10) = .70 ± .28

The probable answer for the 10 holdouts is .70 ± .28. Somewhere between 42% and 98% of the 98 nonresponders from the 1989-1990 cohort probably had nonacademic jobs. In other words, 123 of the responders, plus anywhere from 41 to 96 of the nonresponders, had nonacademic jobs, which means that between 164 and 219 of the 306 people in the cohort, or 54% to 72%, probably had nonacademic jobs.
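The same interval arithmetic can be reproduced in a few lines. This is only a sketch of the calculation described above, using the hypothetical follow-up of 10 nonresponders, 7 of whom reported nonacademic jobs:

```python
import math

# 95% confidence limits for the proportion of nonresponders with
# nonacademic jobs, estimated from the hypothetical n = 10 follow-up.
p, n = 0.70, 10
half_width = 1.96 * math.sqrt(p * (1 - p) / n)        # about 0.28
low, high = p - half_width, p + half_width            # about 0.42 to 0.98

# Apply those limits to the 98 nonresponders and add the 123
# respondents known to hold nonacademic jobs.
nonresponders, known_nonacademic, cohort_size = 98, 123, 306
low_total = known_nonacademic + low * nonresponders    # about 164
high_total = known_nonacademic + high * nonresponders  # about 219
print(low_total / cohort_size, high_total / cohort_size)  # roughly 0.54 to 0.72
```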

A low response rate can be a disaster. People who are quick to fill out and return mailed questionnaires tend to have higher incomes, and consequently more education, than people who respond later. Any dependent variables that co-vary with income and education, then, will be seriously distorted if you get back only 50% of your questionnaires. What’s worse, there is no accurate way to measure nonresponse bias. With a lot of nonresponse, all you know is that you’ve got bias, but you don’t know how to take it into account.

 