Section VI. Interview Pace and Behaviors

Exploring the Antecedents and Consequences of Interviewer Reading Speed (IRS) at the Question Level

Introduction

The pace of interviews (i.e., how quickly interviewers and respondents move through a survey) may vary considerably across interviewers (e.g., Cannell, Miller, and Oksenberg 1981). Researchers have extensively examined the effect of interviewers' pace during survey introductions (which are often unscripted) on participation (e.g., Groves, et al. 2008; Oksenberg and Cannell 1988; Oksenberg, Coleman, and Cannell 1986). Research investigating the effect of pace within the survey itself is sparser, but there is some evidence that a faster pace is associated with lower data quality (Fowler and Mangione 1990; Vandenplas, et al. 2018). Standardized interviewer training typically instructs interviewers to read questions slowly on the assumption that doing so maximizes data quality (e.g., Alcser, et al. 2016; Fowler and Mangione 1990), yet much of the research examining pace during interviews has operationalized pace using completion times for modules or entire interviews (e.g., Loosveldt and Beullens 2013; Olson and Peytchev 2007; Vandenplas, et al. 2018; Olson and Smyth, Chapter 20, this volume). Overall interview completion time reflects interviewer reading speed (IRS), but it may also be affected by other factors (e.g., interviewer reading errors and respondent questions).

Our research extends this work to examine the antecedents and consequences of IRS at the question level across a range of questions. Using data from in-person interviews in a laboratory setting, we test hypotheses about the effects of interviewer experience, question length and sensitivity, and location in the questionnaire on IRS. We also examine the effects of other question characteristics on IRS, including degree of question abstraction and response format. Finally, we examine the impact of respondent and interviewer race and ethnicity (interviewers were matched to respondents on race/ethnicity) on IRS.
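To make the question-level operationalization concrete, the sketch below shows one plausible way to compute IRS as words read per second, assuming timestamped behavior codes that mark when the interviewer begins and ends the initial reading of each question. The function, field names, and example timings are illustrative assumptions, not the measure used in this study.

```python
# Illustrative sketch only: question-level IRS as words read per second,
# assuming behavior-coded timestamps (in seconds) for the start and end
# of the interviewer's initial reading of the question.

def reading_speed(question_text: str, read_start: float, read_end: float) -> float:
    """Return interviewer reading speed (words per second) for one question."""
    n_words = len(question_text.split())
    duration = read_end - read_start
    if duration <= 0:
        raise ValueError("Reading duration must be positive.")
    return n_words / duration

# A 20-word question read in 7 seconds corresponds to roughly 2.9 words per second.
question = ("During the past 12 months, how many times have you seen "
            "or talked to a doctor about your own health?")
print(round(reading_speed(question, read_start=3.0, read_end=10.0), 1))  # 2.9
```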

Although understanding the antecedents of IRS is important, IRS is also of interest to researchers because it may influence the process by which respondents answer questions. We test the impact of IRS on three measures of the respondent's response process: response latencies, respondent behaviors that may indicate comprehension difficulties, and respondent behaviors that may indicate mapping difficulties.

Background

One of the key goals of standardized interviewing is to minimize interviewer-related sources of measurement error, including variation in the speed with which interviews are completed (Fowler and Mangione 1990; Loosveldt and Beullens 2017). Total completion times vary across interviewers (Couper and Kreuter 2013; Loosveldt and Beullens 2013; Vandenplas, et al. 2018), but few studies have specifically examined the speed with which interviewers read questions, independent of other behaviors that might affect overall interview length. Interviewer reading speed is of particular concern because reading speeds that exceed typical conversational norms may compromise respondent comprehension and contribute to measurement error. Consequently, further investigation of interviewer pace, and specifically of IRS, is needed.

Researchers have examined the effects of respondent, interviewer, question, and study design features on interview pace. Older respondents and those with lower cognitive ability tend to have longer interview times (Couper and Kreuter 2013; Loosveldt and Beullens 2013; Olson and Smyth 2015). Employment status and computer use have also been found to be associated with interview time (Olson and Smyth 2015).

Overall, evidence suggests that interviewers have a greater influence on pace than respondents (Loosveldt and Beullens 2013). Studies have found that greater interviewer experience (often operationalized as experience interviewing on a particular survey) is associated with a faster interview pace (Bergmann and Bristle 2016; Loosveldt and Beullens 2013; Olson and Peytchev 2007; Garbarski, et al., Chapter 18, this volume). Examining pace at the item level, Olson and Smyth (2015) found that greater interviewer experience was associated with faster response times (measured across the whole interviewer-respondent interaction) for yes/no questions, but slower response times for open-ended questions. Other question characteristics are also associated with interview speed, including question length (Couper and Kreuter 2013; Olson and Smyth 2015), question sensitivity, and question type (Olson and Smyth 2015).

There is also some evidence that at least one element of study implementation, paying interviewers per completed interview rather than by the hour, increases interview pace (Bergmann and Bristle 2016; Loosveldt and Beullens 2013). Interviewers who are paid by the interview rather than by the hour may be less motivated to carefully follow standardized protocols and have a strong incentive to finish interviews faster (Bergmann and Bristle 2016; Cannell 1977); payment per completed interview may also compromise data quality and lead to falsification.

Faster survey interviews have also been found to be detrimental to survey data quality, increasing both straight-lining (Fowler and Mangione 1990; Vandenplas, et al. 2018) and the likelihood of "don't know" responses (Vandenplas, et al. 2018). Although direct empirical evidence is lacking, it is plausible that faster interviewer reading of questions also creates difficulties for respondents in processing questions.
