Inter-observer agreement in audit of quality of radiology requests and reports

Cited by: 24
Authors
Stavem, K [1]
Foss, T
Botnmark, O
Andersen, OK
Erikssen, J
Affiliations
[1] Akershus Univ Hosp, Dept Med, NO-1474 Nordbyhagen, Norway
[2] Akershus Univ Hosp, Dept Gen Surg, NO-1474 Nordbyhagen, Norway
[3] Akershus Univ Hosp, Dept Radiol, NO-1474 Nordbyhagen, Norway
[4] Akershus Univ Hosp, Norwegian Hlth Serv, Res Ctr, NO-1474 Nordbyhagen, Norway
Keywords
quality control; medical audit; radiology; reporting systems; peer review; methods
DOI
10.1016/j.crad.2004.04.002
Chinese Library Classification (CLC)
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Subject classification codes
1002; 100207; 1009
Abstract
AIMS: To assess the quality of imaging procedure requests and radiologists' reports using an auditing tool, and to assess the agreement between different observers on the quality parameters.
MATERIALS AND METHODS: In an audit using a standardized scoring system, three observers reviewed the request forms for 296 consecutive radiological examinations, and two observers reviewed a random sample of 150 of the corresponding radiologists' reports. We present descriptive statistics from the audit and pairwise inter-observer agreement, using the proportion of agreement and kappa statistics.
RESULTS: The proportion of acceptable item scores (0 or +1) was above 70% for all items except the requesting physician's bleep or extension number, the legibility of the physician's name, and details about previous investigations. For pairs of observers, the inter-observer agreement was generally high; however, the corresponding kappa values were consistently low, with only 14 of 90 ratings >0.60 and 6 >0.80 on the requests/reports. For the quality of the clinical information, the appropriateness of the request, and the requested priority/timing of the investigation, the mean percentage agreement ranged from 67% to 76%, and the corresponding kappa values ranged from 0.08 to 0.24.
CONCLUSION: The inter-observer reliability of scores on the different items showed a high degree of agreement, although the kappa values were low, which is a well-known paradox. Current routines for requesting radiology examinations appeared satisfactory, although several problem areas were identified. (C) 2004 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
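The RESULTS above juxtapose high percentage agreement with low kappa values, the chance-correction effect that the CONCLUSION calls a well-known paradox. As a minimal illustrative sketch (hypothetical two-category ratings, not data from this study), the following Python snippet computes both the proportion of agreement and Cohen's kappa and shows how a skewed distribution of scores pushes kappa down even when raw agreement is 90%:

    # Minimal sketch with hypothetical ratings (not data from the study):
    # high raw agreement can coexist with a low Cohen's kappa when one
    # score category dominates, because chance-expected agreement is high.
    from collections import Counter

    def proportion_agreement(r1, r2):
        """Fraction of items on which the two observers give the same score."""
        return sum(a == b for a, b in zip(r1, r2)) / len(r1)

    def cohens_kappa(r1, r2):
        """Cohen's kappa = (p_o - p_e) / (1 - p_e), with p_e from the marginals."""
        n = len(r1)
        p_o = proportion_agreement(r1, r2)
        m1, m2 = Counter(r1), Counter(r2)
        p_e = sum((m1[c] / n) * (m2[c] / n) for c in set(m1) | set(m2))
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical ratings for 100 items: 88 scored acceptable (+1) by both
    # observers, 2 scored 0 by both, 10 disagreements.
    obs1 = [1] * 88 + [1] * 6 + [0] * 4 + [0] * 2
    obs2 = [1] * 88 + [0] * 6 + [1] * 4 + [0] * 2

    print(f"proportion agreement = {proportion_agreement(obs1, obs2):.2f}")  # 0.90
    print(f"Cohen's kappa        = {cohens_kappa(obs1, obs2):.2f}")          # 0.23

In this made-up example, 94% and 92% of each observer's scores fall in the acceptable category, so chance-expected agreement is already about 0.87; an observed agreement of 0.90 therefore yields a kappa near 0.23, in the same range as the 0.08-0.24 values reported above.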
Pages: 1018-1024
Number of pages: 7