Method and reporting quality in health professions education research: a systematic review

Cited by: 102
Authors
Cook, David A. [1 ,2 ]
Levinson, Anthony J. [3 ]
Garside, Sarah [3 ]
Affiliations
[1] Mayo Clin, Div Gen Internal Med, Coll Med, Rochester, MN 55905 USA
[2] Mayo Clin, Coll Med, Off Educ Res, Rochester, MN 55905 USA
[3] McMaster Univ, Michael G DeGroote Sch Med, Div E Learning Innovat, Hamilton, ON, Canada
Keywords
MEDICAL-EDUCATION; RANDOMIZED-TRIALS; CONSORT STATEMENT; INTERVENTIONS; ABSTRACTS; EPIDEMIOLOGY; VALIDITY; BIAS;
DOI
10.1111/j.1365-2923.2010.03890.x
Chinese Library Classification: G40 [Education]
Discipline code: 040101 [Principles of Education]
Abstract
Context: Studies evaluating reporting quality in health professions education (HPE) research have demonstrated deficiencies, but none have used comprehensive reporting standards. Additionally, the relationship between study methods and effect size (ES) in HPE research is unknown.
Objectives: This review aimed to evaluate, in a sample of experimental studies of Internet-based instruction, the quality of reporting, the relationship between reporting and methodological quality, and associations between ES and study methods.
Methods: We conducted a systematic search of databases, including MEDLINE, Scopus, CINAHL, EMBASE and ERIC, for articles published during 1990-2008. Studies (in any language) quantifying the effect of Internet-based instruction in HPE compared with no intervention or other instruction were included. Working independently and in duplicate, we coded reporting quality using the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement, and coded study methods using a modified Newcastle-Ottawa Scale (m-NOS), the Medical Education Research Study Quality Instrument (MERSQI), and the Best Evidence in Medical Education (BEME) global scale.
Results: For reporting quality, articles scored a mean +/- standard deviation (SD) of 51 +/- 25% of STROBE elements for the Introduction, 58 +/- 20% for the Methods, 50 +/- 18% for the Results and 41 +/- 26% for the Discussion sections. We found positive associations (all p < 0.0001) between reporting quality and MERSQI (rho = 0.64), m-NOS (rho = 0.57) and BEME (rho = 0.58) scores. We explored associations between study methods and knowledge ES by subtracting each study's ES from the pooled ES for studies using that method and comparing these differences between subgroups. Effect sizes in single-group pre-test/post-test studies differed from the pooled estimate more than ESs in two-group studies (p = 0.013). No difference was found between other study methods (yes/no: representative sample, comparison group from same community, randomised, allocation concealed, participants blinded, assessor blinded, objective assessment, high follow-up).
Conclusions: Information is missing from all sections of reports of HPE experiments. Single-group pre-/post-test studies may overestimate ES compared with two-group designs. Other methodological variations did not bias study results in this sample.
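The ES analysis described in the Results can be illustrated with a minimal sketch: for each study, take the absolute deviation of its effect size from the pooled ES of studies sharing its design, then compare those deviations between design subgroups. Using an unweighted mean as the "pooled" estimate, and the function name and data below, are illustrative assumptions; the review pooled ESs meta-analytically and compared subgroups with a significance test.

```python
from statistics import mean

def es_deviations(studies):
    """Absolute deviation of each study's ES from the pooled ES of
    studies sharing its design. Simplified sketch: 'pooled' here is
    an unweighted mean, not a meta-analytic pooled estimate."""
    pooled = {}
    for design in {s["design"] for s in studies}:
        pooled[design] = mean(s["es"] for s in studies if s["design"] == design)
    return [abs(s["es"] - pooled[s["design"]]) for s in studies]

# Hypothetical data: single-group pre-/post-test studies scatter more
# widely around their pooled ES than two-group studies do.
studies = [
    {"design": "single-group", "es": 1.9},
    {"design": "single-group", "es": 0.3},
    {"design": "two-group", "es": 0.5},
    {"design": "two-group", "es": 0.6},
]
devs = es_deviations(studies)
```

A larger mean deviation in one subgroup (as the review found for single-group designs, p = 0.013) suggests that design yields less stable ES estimates.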
Pages: 227-238 (12 pages)