Evaluation of the Neer system of classification of proximal humeral fractures with computerized tomographic scans and plain radiographs

Cited by: 153
Authors
Bernstein, J
Adler, LM
Blank, JE
Dalsey, RM
Williams, GR
Iannotti, JP
Affiliation
[1] Department of Orthopedic Surgery, University of Pennsylvania, School of Medicine, Philadelphia, PA 19104
DOI
10.2106/00004623-199609000-00012
Chinese Library Classification: R826.8 [Plastic Surgery]; R782.2 [Oral and Maxillofacial Plastic Surgery]; R726.2 [Pediatric Plastic Surgery]; R62 [Plastic Surgery (Reconstructive Surgery)]
Abstract
The intraobserver reliability and interobserver reproducibility of the Neer classification system were assessed on the basis of the plain radiographs and computerized tomographic scans of twenty fractures of the proximal part of the humerus. To determine whether the observers had difficulty agreeing only about the degree of displacement or angulation (but could determine which segments were fractured), a modified system (in which fracture lines were considered but displacement was not) also was assessed. Finally, the observers were asked to recommend a treatment for the fracture, and the reliability and reproducibility of that decision were measured. The radiographs and computerized tomographic scans were viewed on two occasions by four observers, including two residents in their fifth year of postgraduate study and two fellowship-trained shoulder surgeons. Kappa coefficients then were calculated. The mean kappa coefficient for intraobserver reliability was 0.64 when the fractures were assessed with radiographs alone, 0.72 when they were assessed with radiographs and computerized tomographic scans, 0.68 when they were classified according to the modified system in which displacement and angulation were not considered, and 0.84 for treatment recommendations; the mean kappa coefficients for interobserver reproducibility were 0.52, 0.50, 0.56, and 0.65, respectively. The interobserver reproducibility of the responses of the attending surgeons regarding diagnosis and treatment did not change when the fractures were classified with use of computerized tomographic scans in addition to radiographs or with use of the modified system in which displacement and angulation were not considered; the mean kappa coefficient was 0.64 for all such comparisons. Overall, the addition of computerized tomographic scans was associated with a slight increase in intraobserver reliability but no increase in interobserver reproducibility. The classification of fractures of the shoulder remains difficult because even experts cannot uniformly agree about which fragments are fractured. Because of this underlying difficulty, optimum patient care might require the development of new imaging modalities and not necessarily new classification systems.
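The kappa coefficients reported in the abstract measure chance-corrected agreement between raters. As a minimal illustrative sketch, Cohen's kappa for two raters can be computed as (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance from each rater's label frequencies. The observer data below are hypothetical, not from the study:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical Neer classifications of ten fractures by two observers.
obs1 = ["2-part", "3-part", "2-part", "4-part", "3-part",
        "2-part", "2-part", "3-part", "4-part", "2-part"]
obs2 = ["2-part", "3-part", "3-part", "4-part", "3-part",
        "2-part", "2-part", "2-part", "4-part", "2-part"]
print(round(cohens_kappa(obs1, obs2), 2))  # → 0.68
```

Note that kappa penalizes disagreement more heavily when one category dominates, which is why raw percent agreement (0.80 here) exceeds kappa (0.68).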
Pages: 1371-1375 (5 pages)