If you are trying to write good survey questions in a language other than your own, the best practice is the method of back translation (Brislin 1970; Werner and Campbell 1970).

BOX 9.6


Use cognitive testing (also called the think-aloud method) in pretesting questions (Willis 2005). In this method, people think out loud as they decide how to answer each question in a survey. There are three possible outcomes with the think-aloud technique: (1) People understand the question just as you intended them to; (2) People understand the question very well, but not the way you intended them to; and (3) People don't understand the question at all. Edwards et al. (2005) used this method to pretest a 28-question survey on the use of condoms by women sex workers in Mombasa, Kenya. The result was a survey with culturally appropriate vocabulary for various types of sex clients.

Don Dillman and his colleagues at the Washington State University survey research center begin think-aloud interviews with two questions: (1) How many residences have you lived in since you were born? (2) How many windows are in your home? On the first question, some people think of cities where they've lived, while others try to think of individual residences. On the second, questions come up, like: "Is a sliding glass door a window?" These questions help respondents understand what the interview is really about: learning where the ambiguities are in questions (Dillman, Smyth, and Christian 2009:221–23) (Further Reading: think-aloud and cognitive interviews).

First, write the questionnaire in your native language. Then have it translated by a bilingual person who is a native speaker of the language you are working in. Work closely with the translator, so that she or he can fully understand the subtleties you want to convey in your questionnaire items.

Next, ask another bilingual person, who is a native speaker of your language, to translate the questionnaire back into that language. This back translation should be almost identical to the original questionnaire you wrote. If it isn’t, then something was lost in one of the two translations. You’d better find out which one it was and correct the problem.

Beck and Gable (2000) developed a scale for screening postpartum women for depression and then translated the scale into Spanish (Beck and Gable 2003). One item on the original scale was "I felt like my emotions were on a roller coaster." The first translator offered two options for this: "Sentí un sube y baja emocional" and "Sentí un desequilibrio emocional." The second translator translated these back as "I felt like my emotions were up and down" and "I felt emotional instability" (Beck and Gable 2003:69). Not exactly the same feeling as "I felt like my emotions were on a roller coaster," but close. Do you go with one of the two Spanish translations offered? Which one? Or do you keep looking for something better in Spanish? The answer is that you sit down with both translators, talk it through, and come to a consensus (box 9.7).

You can use back translation to check the content of open-ended interviews, but be warned: This is tough work. Daniel Reboussin (1995) interviewed Diola women who had come from southwestern Senegal to Dakar in search of work. All the women used French at work, but they preferred Diola for interviews. Reboussin, who speaks French, spoke very little Diola, so he worked with an interpreter—a man named Antoine Badji—to develop an interview schedule in French, which Badji translated into Diola.

BOX 9.7


For all its rough edges, on-the-fly translation is probably just fine for a lot of research. Since 1984, the Demographic and Health Survey (DHS) has been conducted in 75 countries across the developing world to provide data on reproductive health. Each survey is meticulously translated into local (not just national) languages. In Kenya, for example, the DHS is produced in 10 local languages, as well as in English and Swahili. Interviewers are assigned to regions where their native language is dominant. There is always some population mixing, so interviewers get a chunk of survey materials in their own language as well as a chunk in Swahili, the national language. The problem is, lots of people don't speak Swahili. So, when a Luo interviewer runs out of interviews in Luo and has to interview a Luo speaker who doesn't speak Swahili, she has to translate the questions from the Swahili version on the fly. This on-the-fly translation happened in 23% of the 7,480 interviews in the 1998 DHS. Weinreb and Sana (2009) found that this made no statistical difference in the univariate data for 22 out of 24 variables in the survey, from household characteristics to reports about use of contraceptives.

During the interviews, Badji asked questions in Diola and Reboussin audiotaped the responses. After each interview, Badji translated each tape (orally) into French. Reboussin transcribed the French translations, translating into English as he went. Then he read his English transcriptions back to Badji, translating (orally) into French as he read. That way, Badji could confirm or disconfirm Reboussin’s understanding of Badji’s French rendering of the tapes.

As I said, this was tough work. It took Reboussin and Badji 17 weeks to conduct 30 interviews and get them all down into English.
