Journal peer review as an information retrieval process

Times Cited: 2
Authors
Bornmann, Lutz [1 ]
Egghe, Leo [2 ,3 ]
Affiliations
[1] Max Planck Gesell, Munich, Germany
[2] Univ Antwerp, B-2020 Antwerp, Belgium
[3] Univ Hasselt, Diepenbeek, Belgium
Keywords
Information science; Information retrieval; Periodicals; Peer review; PRECISION; FALLOUT; NUMBER; RECALL;
DOI
10.1108/00220411211239093
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Discipline Classification Code
0812;
Abstract
Purpose - Editorial peer review systems of journals do not always accept the best papers. Because of differences in human perception, the evaluation of a paper in peer review (for a journal) can differ from the impact the paper has after publication (measured by the number of citations received) in that journal or another. This situation, and its problems, resembles the information retrieval process in a documentary system: there, too, one does not always retrieve the most relevant documents for a given topic, because the topic is expressed in the command language of the system, and that query does not always fully capture the "real topic" one wants to describe. This paper aims to address this issue.

Design/methodology/approach - On this basis, classical information retrieval evaluation techniques are applied to the evaluation of peer review systems. Central to such an evaluation are the notions of precision and recall and the precision-recall curve; these notions are introduced here for the evaluation of peer review systems.

Findings - The analogues of precision and recall are defined, and their curve is constructed from peer review data of the journal Angewandte Chemie - International Edition and from citation impact data of papers accepted by this journal or rejected by it but published elsewhere. It is concluded that, due to the imperfect peer review process (based on human evaluation), a journal that wants to publish a large number of qualified papers (the ones sought) must also accept several non-qualified papers.

Originality/value - The authors conclude that, due to the imperfect peer review process (based on human evaluation), a journal that wants to publish a large number of qualified papers (the ones sought) will also accept several non-qualified papers.
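As a minimal illustration of the analogy (not the authors' implementation or data), the sketch below treats accepted manuscripts as the "retrieved" set and highly cited papers as the "relevant" set; the referee scores, citation counts, and the 30-citation qualification threshold are purely hypothetical. Sweeping the acceptance threshold traces a precision-recall curve of the kind discussed in the paper: recall reaches 1 only once precision has dropped, i.e. all qualified papers are captured only by also accepting non-qualified ones.

```python
# Sketch (hypothetical data): precision/recall analogues for peer review,
# with "accepted" playing the role of "retrieved" and "highly cited"
# playing the role of "relevant".
from dataclasses import dataclass

@dataclass
class Submission:
    referee_score: float   # higher = better referee judgement (hypothetical)
    citations: int         # citations received after publication, in any journal

def precision_recall(submissions, accept_threshold, qualified_min_citations):
    """Accepted = referee_score >= accept_threshold (the 'retrieved' set);
    qualified = citations >= qualified_min_citations (the 'relevant' set)."""
    accepted = [s for s in submissions if s.referee_score >= accept_threshold]
    qualified = [s for s in submissions if s.citations >= qualified_min_citations]
    hits = [s for s in accepted if s.citations >= qualified_min_citations]
    precision = len(hits) / len(accepted) if accepted else 1.0
    recall = len(hits) / len(qualified) if qualified else 1.0
    return precision, recall

# Hypothetical submissions: (referee score, later citation count).
data = [Submission(4.5, 60), Submission(4.0, 5), Submission(3.5, 40),
        Submission(3.0, 2), Submission(2.5, 35), Submission(2.0, 1)]

# Lowering the acceptance threshold raises recall but admits more
# non-qualified papers, lowering precision.
for t in (4.5, 4.0, 3.5, 3.0, 2.5, 2.0):
    p, r = precision_recall(data, t, qualified_min_citations=30)
    print(f"threshold={t}: precision={p:.2f}, recall={r:.2f}")
```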
Pages: 527-535
Number of pages: 9
Related Papers
16 in total
[1]   Scientific Peer Review [J].
Bornmann, Lutz .
ANNUAL REVIEW OF INFORMATION SCIENCE AND TECHNOLOGY, 2011, 45 :199-245
[2]  
Bornmann L, 2010, PLOS ONE, V5, DOI [10.1371/journal.pone.0011344, 10.1371/journal.pone.0013327]
[3]   The manuscript reviewing process: Empirical research on review requests, review sequences, and decision rules in peer review [J].
Bornmann, Lutz ;
Daniel, Hans-Dieter .
LIBRARY & INFORMATION SCIENCE RESEARCH, 2010, 32 (01) :5-12
[4]   Extent of type I and type II errors in editorial decisions: A case study on Angewandte Chemie International Edition [J].
Bornmann, Lutz ;
Daniel, Hans-Dieter .
JOURNAL OF INFORMETRICS, 2009, 3 (04) :348-352
[5]   The luck of the referee draw: the effect of exchanging reviews [J].
Bornmann, Lutz ;
Daniel, Hans-Dieter .
LEARNED PUBLISHING, 2009, 22 (02) :117-125
[6]   The measures precision, recall, fallout and miss as a function of the number of retrieved documents and their mutual interrelations [J].
Egghe, L. .
INFORMATION PROCESSING & MANAGEMENT, 2008, 44 (02) :856-876
[7]   Existence theorem of the quadruple (P, R, F, M):: Precision, recall, fallout and miss [J].
Egghe, L. .
INFORMATION PROCESSING & MANAGEMENT, 2007, 43 (01) :265-272
[8]  
Kashima H, 2006, IEEE DATA MINING, P340
[9]  
Popescul A., 2003, Proceedings of the 2nd Workshop on Multi-Relational Data Mining MRDM-2003, P92
[10]  
Salton G., 1987, INTRO MODERN INFORM