Multirater agreement of arthroscopic meniscal lesions

Cited by: 52
Authors
Dunn, WR
Wolf, BR
Amendola, A
Andrish, JT
Kaeding, C
Marx, RG
McCarty, EC
Parker, RD
Wright, RW
Spindler, KP
Affiliations
[1] Vanderbilt Sports Med Ctr, Nashville, TN 37212 USA
[2] Hosp Special Surg, New York, NY 10021 USA
[3] Univ Iowa Hosp & Clin, Iowa City, IA 52242 USA
[4] Cleveland Clin Fdn, Cleveland, OH 44195 USA
[5] Ohio State Sports Med Ctr, Columbus, OH USA
[6] Colorado Univ Sports Med, Denver, CO USA
[7] Washington Univ, Orthoped & Sports Med Ctr, St Louis, MO USA
Keywords
multicenter; meniscus; multirater agreement; Multicenter Orthopaedic Outcomes Network (MOON)
DOI
10.1177/0363546504264586
Chinese Library Classification (CLC)
R826.8 [Plastic Surgery]; R782.2 [Oral and Maxillofacial Plastic Surgery]; R726.2 [Pediatric Plastic Surgery]; R62 [Plastic Surgery (Reconstructive Surgery)]
Abstract
Background: Establishing the validity of classification schemes is a crucial preparatory step that should precede multicenter studies. There are no studies investigating the reproducibility of arthroscopic classification of meniscal pathology among multiple surgeons at different institutions.
Hypothesis: Arthroscopic classification of meniscal pathology is reliable and reproducible and suitable for multicenter studies that involve multiple surgeons.
Study Design: Multirater agreement study.
Methods: Seven surgeons reviewed a video of 18 meniscal tears and completed a meniscal classification questionnaire. Multirater agreement was calculated based on the proportion of agreement, the kappa coefficient, and the intraclass correlation coefficient.
Results: There was a 46% agreement on the central/peripheral location of tears (kappa = 0.30), an 80% agreement on the depth of tears (kappa = 0.46), a 72% agreement on the presence of a degenerative component (kappa = 0.44), a 71% agreement on whether lateral tears were central to the popliteal hiatus (kappa = 0.42), a 73% agreement on the type of tear (kappa = 0.63), an 87% agreement on the location of the tear (kappa = 0.61), and an 84% agreement on the treatment of tears (kappa = 0.66). There was considerable agreement among surgeons on length, with an intraclass correlation coefficient of 0.78, 95% confidence interval of 0.57 to 0.92, and P < .001.
Conclusions: Arthroscopic grading of meniscal pathology is reliable and reproducible.
Clinical Relevance: Surgeons can reliably classify meniscal pathology and agree on treatment, which is important for multicenter trials.
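The kappa values reported above correct raw percent agreement for agreement expected by chance. For a fixed panel of raters classifying the same subjects into categories (here, 7 surgeons classifying 18 tears), the standard multirater statistic is Fleiss' kappa. The sketch below is an illustrative reimplementation of that formula, not the authors' actual analysis code:

```python
# Illustrative sketch of Fleiss' kappa for multirater categorical agreement
# (e.g., 7 surgeons each classifying the same set of meniscal tears).
# Not the study's analysis code; hypothetical data shapes are assumed.

def fleiss_kappa(ratings):
    """ratings: one dict per subject mapping category -> number of raters
    who chose that category. Assumes the same rater count per subject."""
    n_subjects = len(ratings)
    n_raters = sum(ratings[0].values())

    categories = set()
    for r in ratings:
        categories |= set(r)

    # Observed agreement: mean per-subject pairwise agreement P_i
    p_bar = 0.0
    for r in ratings:
        s = sum(c * c for c in r.values())
        p_bar += (s - n_raters) / (n_raters * (n_raters - 1))
    p_bar /= n_subjects

    # Chance agreement P_e from marginal category proportions
    p_e = 0.0
    for cat in categories:
        p_j = sum(r.get(cat, 0) for r in ratings) / (n_subjects * n_raters)
        p_e += p_j * p_j

    return (p_bar - p_e) / (1 - p_e)


# Hypothetical example: 7 raters, 2 tears, unanimous on each -> kappa = 1.0
print(fleiss_kappa([{"bucket-handle": 7}, {"radial": 7}]))
```

By the conventional Landis and Koch benchmarks, the reported values of 0.30-0.46 indicate fair to moderate agreement and 0.61-0.66 substantial agreement.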
Pages: 1937-1940 (4 pages)