Finding a Comparison Group: Is Online Crowdsourcing a Viable Option?

Cited by: 23
Authors
Azzam, Tarek [1]
Jacobson, Miriam R. [1]
Affiliations
[1] Claremont Grad Univ, Claremont, CA 91711 USA
Keywords
matched comparison group; crowdsourcing; quasi-experimental design; experimental design; evaluation methods;
DOI
10.1177/1098214013490223
Chinese Library Classification
C [Social Sciences, General];
Discipline classification codes
03; 0303;
Abstract
This article explores the viability of online crowdsourcing for creating matched-comparison groups. This exploratory study compares survey results from a randomized control group with those from a matched-comparison group recruited through Amazon.com's Mechanical Turk (MTurk) crowdsourcing service to assess their comparability. Findings indicate that online crowdsourcing, which gives researchers access to a large pool of participants willing to complete specific tasks, is a potentially viable resource for evaluation designs where access to comparison groups, large budgets, and/or time is limited. The article highlights the strengths and limitations of the online crowdsourcing approach and describes ways it could be used in evaluation practice.
Pages: 372-384
Page count: 13