The first Italian research assessment exercise: A bibliometric perspective

Cited by: 73
Authors
Franceschet, Massimo [1 ]
Costantini, Antonio [2 ]
Affiliations
[1] Univ Udine, Dept Math & Comp Sci, I-33100 Udine, Italy
[2] Univ Udine, Dept Agr & Environm Sci, I-33100 Udine, Italy
Keywords
Research assessment; Peer review; Bibliometrics; CITATION DISTRIBUTIONS; INDICATORS; INDEX; UNIVERSALITY; VALIDATION; METRICS;
DOI
10.1016/j.joi.2010.12.002
Chinese Library Classification (CLC)
TP39 [Computer applications]
Subject classification codes
081203; 0835
Abstract
In December 2003, seventeen years after the first UK research assessment exercise, Italy started its first-ever national research evaluation, with the aim of evaluating, through peer review, the excellence of the national research production. The evaluation involved 20 disciplinary areas, 102 research structures, 18,500 research products and 6661 peer reviewers (1465 from abroad); it had a direct cost of 3.55 million euros and spanned 18 months. The introduction of ratings based on ex post quality of output, rather than on ex ante respect for parameters and compliance, marks an important step of the national research evaluation system toward meritocracy. From the bibliometric perspective, the national assessment offered the unprecedented opportunity to perform a large-scale comparison of peer review and bibliometric indicators for an important share of the Italian research production. The present investigation takes full advantage of this opportunity to test whether peer review judgements and (article and journal) bibliometric indicators are independent variables and, if they are not, to measure the sign and strength of the association. The outcomes allow us to advocate the use of bibliometric evaluation, suitably integrated with expert review, for the forthcoming national assessment exercises, with the goal of shifting from the assessment of research excellence to the evaluation of average research performance without a significant increase in expenses. (C) 2010 Elsevier Ltd. All rights reserved.
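The record does not state which statistical tools the authors used to test independence and quantify the association between peer ratings and bibliometric indicators. As a minimal, hypothetical sketch (synthetic data, an assumed 4-point peer-rating scale, and citation counts standing in for the bibliometric indicator), one common approach is a rank correlation together with a chi-square test of independence on a contingency table:

# Illustrative sketch only: the paper's actual methodology is not given in this record.
# Synthetic data are used to show how independence between ordinal peer ratings and a
# bibliometric indicator (citation counts) might be tested, and how the sign and
# strength of the association might be measured.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical data: peer ratings on an assumed 4-point scale and article citation counts.
n = 500
peer_rating = rng.integers(1, 5, size=n)
# Citations loosely increasing with rating, plus skewed noise.
citations = rng.poisson(lam=3 * peer_rating) + rng.poisson(lam=2, size=n)

# Spearman's rho: tests the null hypothesis of no monotone association
# and quantifies its sign and strength.
rho, p_value = stats.spearmanr(peer_rating, citations)
print(f"Spearman rho = {rho:.3f}, p-value = {p_value:.2e}")

# Alternative: bin citations into quartiles and run a chi-square test of
# independence on the resulting 4 x 4 contingency table.
quartile = np.digitize(citations, np.percentile(citations, [25, 50, 75]))
table = np.zeros((4, 4), dtype=int)
for r, q in zip(peer_rating, quartile):
    table[r - 1, q] += 1
chi2, chi2_p, dof, _ = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, dof = {dof}, p-value = {chi2_p:.2e}")

A small p-value in either test would indicate that peer judgements and the indicator are not independent; the sign of rho then gives the direction of the association.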
Pages: 275-291
Page count: 17