Asking Probing Questions in Web Surveys: Which Factors have an Impact on the Quality of Responses?

Cited by: 33
Authors:
Behr, Dorothee [1 ]
Kaczmirek, Lars
Bandilla, Wolfgang [1 ]
Braun, Michael
Affiliations:
[1] GESIS Leibniz Inst Social Sci, Dept Survey Design & Methodol, D-68072 Mannheim, Germany
Keywords:
web survey design; probing; open-ended questions; cognitive interviewing; nonprobability online panels
DOI:
10.1177/0894439311435305
CLC number:
TP39 [Applications of Computers]
Subject classification codes:
081203; 0835
Abstract:
Cognitive interviewing is a well-established method for evaluating and improving a questionnaire prior to fielding. However, its present implementation brings with it some challenges, notably small sample sizes and the possibility of interviewer effects. In this study, the authors test web surveys through nonprobability online panels as a supplemental means of implementing cognitive interviewing techniques, with the overall goal of tackling these challenges. The focus of this article is on methodological features that pave the way for an eventual successful implementation of category-selection probing in web surveys. The study reports on the results of 1,023 respondents from Germany. In order to identify implementation features that lead to a high number of meaningful answers, the authors explore the effects on answer quality of (1) different panels, (2) different probing variants, and (3) different numbers of preceding probes. The overall results suggest that category-selection probing can indeed be implemented in web surveys. Using data from two panels (a community panel, where members can actively get involved, for example by creating their own polls, and a "conventional" panel, where answering surveys is the members' only activity), the authors find that high community involvement does not increase the likelihood of answering probes or of producing longer statements. Testing three probing variants that differ in wording and provided context, the authors find that presenting the context of the probe (i.e., the probed item and the respondent's answer) produces a higher number of meaningful answers. Finally, the likelihood of answering a probe decreases with the number of preceding probes. However, the word count of those who eventually answer the probes slightly increases with an increasing number of preceding probes.
Pages: 487-498 (12 pages)