Do altmetrics assess societal impact in a comparable way to case studies? An empirical test of the convergent validity of altmetrics based on data from the UK research excellence framework (REF)

Cited by: 92
Authors
Bornmann, Lutz [1 ]
Haunschild, Robin [2 ]
Adams, Jonathan [3 ,4 ]
Affiliations
[1] Max Planck Gesell, Adm Headquarters, Div Sci & Innovat Studies, Hofgartenstr 8, D-80539 Munich, Germany
[2] Max Planck Inst Solid State Res, Heisenbergstr 1, D-70569 Stuttgart, Germany
[3] ISI Clarivate Analyt, 160 Blackfriars Rd, London, England
[4] Kings Coll London, Policy Inst King, 22 Kingsway, London WC2B 6LE, England
Funding
US National Institutes of Health;
Keywords
Bibliometrics; Altmetrics; MHq; Societal impact; Case studies; Research excellence framework; REF2014; STATE-OF-THE-ART; SOCIAL MEDIA; INDICATORS; CITATIONS; WEB; DOCUMENTS; WIKIPEDIA; ARTICLES; SCIENCE; BLOG
DOI
10.1016/j.joi.2019.01.008
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Discipline classification code
081203; 0835
Abstract
Altmetrics have been proposed as a way to assess the societal impact of research. Although altmetrics are already in use as impact or attention metrics in different contexts, it is still not clear whether they really capture or reflect societal impact. This study is based on altmetrics, citation counts, research output and case study data from the UK Research Excellence Framework (REF), and peers' REF assessments of research output and societal impact. We investigated the convergent validity of altmetrics by using two REF datasets: publications submitted as research output (PRO) to the REF and publications referenced in case studies (PCS). Case studies, which are intended to demonstrate societal impact, should cite the most relevant research papers. We used the MHq' indicator for assessing impact, an indicator introduced for count data with many zeros. The results of the first part of the analysis show that news media as well as mentions on Facebook, in blogs, in Wikipedia, and in policy-related documents have higher MHq' values for PCS than for PRO. Thus, the altmetric indicators seem to have convergent validity for these data. In the second part of the analysis, altmetrics were correlated with REF reviewers' average scores on PCS. The negative or close-to-zero correlations question the convergent validity of altmetrics in that context. We suggest that they may capture a different aspect of societal impact (which can be called unknown attention) from that seen by reviewers (who are interested in the causal link between research and action in society). (C) 2019 Elsevier Ltd. All rights reserved.
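Note on the second part of the analysis described in the abstract: the sketch below illustrates, in Python, how altmetric mention counts might be correlated with reviewers' average scores using a rank correlation suited to skewed count data with many zeros. This is a minimal illustrative sketch only; the column names and the synthetic data are assumptions and do not reproduce the authors' code, data, or the MHq' computation.

```python
# Illustrative sketch (not the authors' code): rank correlation between
# altmetric counts and average reviewer scores, with hypothetical columns.
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical example data: one row per publication referenced in a case study.
data = pd.DataFrame({
    "facebook_mentions": [0, 0, 3, 1, 0, 7, 0, 2],
    "news_mentions":     [1, 0, 0, 4, 0, 2, 0, 0],
    "avg_ref_score":     [3.2, 2.8, 3.9, 3.5, 2.1, 3.7, 2.5, 3.0],
})

# Spearman's rho is a common choice for count data with many zeros;
# values near zero or negative would question convergent validity,
# in line with the result reported in the abstract.
for col in ["facebook_mentions", "news_mentions"]:
    rho, p = spearmanr(data[col], data["avg_ref_score"])
    print(f"{col}: rho = {rho:.2f}, p = {p:.3f}")
```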
Pages: 325-340
Page count: 16