Death to Kappa: birth of quantity disagreement and allocation disagreement for accuracy assessment

Cited by: 1425
Authors
Pontius, Robert Gilmore, Jr. [1 ]
Millones, Marco [1 ]
Affiliation
[1] Clark Univ, Sch Geog, Worcester, MA 01610 USA
Funding
US National Science Foundation;
Keywords
THEMATIC CLASSIFICATION ACCURACY; LAND-COVER; CATEGORICAL MAPS; AGREEMENT; COEFFICIENT; MODELS; LOCATION;
DOI
10.1080/01431161.2011.552923
Chinese Library Classification
TP7 [Remote Sensing Technology];
Discipline classification codes
081102 ; 0816 ; 081602 ; 083002 ; 1404 ;
Abstract
The family of Kappa indices of agreement claims to compare a map's observed classification accuracy relative to the expected accuracy of baseline maps that can have two types of randomness: (1) random distribution of the quantity of each category and (2) random spatial allocation of the categories. Use of the Kappa indices has become part of the culture in remote sensing and other fields. This article examines five different Kappa indices, some of which were derived by the first author in 2000. We expose the indices' properties mathematically and illustrate their limitations graphically, with emphasis on Kappa's use of randomness as a baseline, and the often-ignored conversion from an observed sample matrix to the estimated population matrix. This article concludes that these Kappa indices are useless, misleading and/or flawed for the practical applications in remote sensing that we have seen. After more than a decade of working with these indices, we recommend that the profession abandon the use of Kappa indices for purposes of accuracy assessment and map comparison, and instead summarize the cross-tabulation matrix with two much simpler summary parameters: quantity disagreement and allocation disagreement. This article shows how to compute these two parameters using examples taken from peer-reviewed literature.
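The two summary parameters the abstract proposes can be computed directly from a cross-tabulation matrix of map categories. The sketch below follows the standard definitions from this line of work: quantity disagreement is half the sum of absolute differences between the row (comparison map) and column (reference map) category proportions, and allocation disagreement is half the sum, over categories, of twice the smaller of each category's commission and omission. The function name and the 2x2 example counts are illustrative, not taken from the article.

```python
import numpy as np

def quantity_allocation_disagreement(matrix):
    """Compute total quantity and allocation disagreement from a
    cross-tabulation matrix (rows = comparison map, columns = reference map).
    Entries may be counts or proportions; they are normalized internally."""
    p = np.asarray(matrix, dtype=float)
    p = p / p.sum()            # convert counts to population proportions
    row = p.sum(axis=1)        # proportion of each category in the comparison map
    col = p.sum(axis=0)        # proportion of each category in the reference map
    diag = np.diag(p)          # diagonal = agreement for each category
    # Quantity disagreement: mismatch in the overall proportion of each category
    Q = 0.5 * np.abs(row - col).sum()
    # Allocation disagreement: per category, twice the smaller of
    # commission (row - diag) and omission (col - diag)
    A = 0.5 * (2.0 * np.minimum(row - diag, col - diag)).sum()
    return Q, A

# Hypothetical 2x2 cross-tabulation of pixel counts
m = [[60, 10],
     [5, 25]]
Q, A = quantity_allocation_disagreement(m)
# Q -> 0.05, A -> 0.10; Q + A = 0.15 = 1 - 0.85 (overall agreement)
```

A useful check on any implementation is that Q + A equals the total disagreement, i.e. one minus the sum of the diagonal proportions.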
Pages: 4407-4429
Page count: 23