A measure of association for ordered categorical data in population-based studies

Cited: 12
Authors
Nelson, Kerrie P. [1]
Edwards, Don [2]
Affiliations
[1] Boston Univ, Dept Biostat, 801 Massachusetts Ave, Boston, MA 02118 USA
[2] Univ South Carolina, Dept Stat, Columbia, SC USA
Funding
U.S. National Institutes of Health
Keywords
Agreement; association; crossed random effects; generalized linear mixed model; ordinal classifications; weighted kappa; interobserver reproducibility; prostatic carcinoma; model; classification; coefficient; variability; reliability; fractures
DOI
10.1177/0962280216643347
Chinese Library Classification (CLC)
R19 [Health Care Organization and Services (Health Services Administration)]
Subject Classification Code
Abstract
Ordinal classification scales are commonly used to define a patient's disease status in screening and diagnostic tests such as mammography. Challenges arise in agreement studies when evaluating the association among multiple raters' classifications of patients' disease or health status on an ordered categorical scale. In this paper, we describe a population-based approach and a chance-corrected measure of association for evaluating the strength of the relationship between multiple raters' ordinal classifications, accommodating any number of raters. In contrast to Shrout and Fleiss' intraclass correlation coefficient, the proposed measure of association is invariant to changes in disease prevalence. We demonstrate how unique characteristics of individual raters can be explored using random effects. Simulation studies demonstrate the properties of the proposed method under varying assumptions. The methods are applied to two large-scale agreement studies of breast cancer screening and prostate cancer severity.
Pages: 812-831
Number of pages: 20
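
Note: the abstract describes a chance-corrected measure of association for ordinal classifications from many raters, built on a generalized linear mixed model with crossed random effects for patients and raters. The Python sketch below is a minimal illustration of that setting, not the authors' estimator: it simulates patient-by-rater ordinal ratings from a hypothetical latent-trait model with crossed random effects, then computes the standard quadratic-weighted kappa, kappa_w = (P_o - P_e) / (1 - P_e), averaged over rater pairs, as a familiar chance-corrected reference point. All names and parameter values (n_subjects, cutpoints, effect standard deviations) are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(2016)

# Simulate patient-by-rater ordinal ratings from a hypothetical crossed
# random-effects latent-trait model (illustrative, not the paper's model).
n_subjects, n_raters, n_categories = 200, 10, 4
subject_effect = rng.normal(0.0, 1.5, n_subjects)  # latent disease severity per patient
rater_effect = rng.normal(0.0, 0.5, n_raters)      # rater-specific severity shift
cutpoints = np.array([-1.0, 0.0, 1.0])             # shared thresholds on the latent scale

latent = (subject_effect[:, None] + rater_effect[None, :]
          + rng.normal(0.0, 1.0, (n_subjects, n_raters)))
ratings = np.digitize(latent, cutpoints)           # ordinal categories 0..3

def weighted_kappa(a, b, k):
    """Quadratic-weighted Cohen's kappa for two raters' labels in 0..k-1."""
    conf = np.zeros((k, k))
    np.add.at(conf, (a, b), 1.0)                   # build the k-by-k confusion matrix
    conf /= conf.sum()
    w = 1.0 - np.subtract.outer(np.arange(k), np.arange(k)) ** 2 / (k - 1) ** 2
    p_obs = (w * conf).sum()                       # weighted observed agreement
    p_exp = (w * np.outer(conf.sum(axis=1), conf.sum(axis=0))).sum()  # chance agreement
    return (p_obs - p_exp) / (1.0 - p_exp)

pairs = [(r, s) for r in range(n_raters) for s in range(r + 1, n_raters)]
kappas = [weighted_kappa(ratings[:, r], ratings[:, s], n_categories) for r, s in pairs]
print(f"mean pairwise quadratic-weighted kappa: {np.mean(kappas):.3f}")

Because the simulated raters share a latent patient ordering, the average pairwise kappa should land well above zero. The paper's population-based measure plays an analogous summary role but is derived from the fitted mixed model and, per the abstract, is invariant to disease prevalence.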