This investigation examines a competency-based clinical skills assessment program for surgical clerks that used checklists and rating forms for precise measurement of physical examination (PE) skills, physician-patient interaction (PPI) skills, and patient write-up (PW) skills. Analysis of variance demonstrated improvement in PW skills across the academic year when measured by the rating instrument, but this improvement was not detected on traditional subjective rating forms (SRF). PPI skills improved between the first rotations of 2 consecutive academic years after an orientation to expectations was added (mean, 79% versus 92%; P < .001). Poor correlation was noted between National Board of Medical Examiners Surgery Subtest scores and PE skills (r = .19), PW skills (r = .20), and PPI skills (r = .15). While the overall ratings given by faculty on the SRF correlated with the SRF ratings of PE skills (r = .77) and PPI skills (r = .58), these same faculty ratings correlated poorly with these skills as assessed by checklist (r = .16 and r = .14, respectively). This pilot experience demonstrates that PE, PW, and PPI skills (1) improve only with orientation to expectations and feedback, (2) correlate poorly with assessment of fund of knowledge, and (3) are best assessed with precise measurement (eg, checklist, direct observation), which avoids the halo effect of overall evaluation that occurs with subjective rating forms.