Multi-institutional validation of a web-based core competency assessment system. J Surg Educ. 2007 Nov-Dec;64(6):390-4.
The Association of Program Directors in Surgery and the Division of Education of the American College of Surgeons developed and implemented a web-based system for end-of-rotation faculty assessment of residents on the Accreditation Council for Graduate Medical Education (ACGME) core competencies. This study assesses the system's reliability and validity across multiple programs.
Each assessment included ratings (1-5 scale) on 23 items reflecting the 6 core competencies. A total of 4241 end-of-rotation assessments were completed for 332 general surgery residents (≥5 evaluations each) at 5 sites during the 2004-2005 and 2005-2006 academic years. For each academic year, the mean rating on each item was computed for each resident, and the mean rating of the items representing each competency was then computed per resident. Additional data included United States Medical Licensing Examination (USMLE) and American Board of Surgery In-Training Examination (ABSITE) scores, postgraduate year (PGY), and program status (categorical, designated preliminary, or nondesignated preliminary).
Coefficient alpha exceeded 0.90 for each competency score. Mean ratings for each competency increased significantly (p < 0.01) as a function of PGY. Mean ratings for Professionalism and Interpersonal and Communication Skills (IPC) were significantly higher than those for all other competencies at all PGY levels. Competency ratings of PGY 1 residents correlated significantly with USMLE Step 1 scores, ranging from r = 0.26 (p < 0.01) for Professionalism to r = 0.41 (p < 0.001) for Systems-Based Practice. Ratings of Knowledge (r = 0.31, p < 0.01), Practice-Based Learning and Improvement (PBLI; r = 0.22, p < 0.05), and Systems-Based Practice (r = 0.20, p < 0.05) correlated significantly with the 2005 ABSITE total percentile. Ratings of all competencies correlated significantly with the 2006 ABSITE total percentile, ranging from r = 0.20 (p < 0.05) for Professionalism to r = 0.35 (p < 0.001) for Knowledge. Categorical and designated preliminary residents received significantly higher ratings (p < 0.05) than nondesignated preliminary residents for Knowledge, Patient Care, PBLI, and Systems-Based Practice only.
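The internal-consistency statistic reported above is Cronbach's coefficient alpha. The abstract does not describe the computation, but as an illustrative sketch (not the authors' code), alpha for a resident-by-item rating matrix can be computed from the item variances and the variance of the total score:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's coefficient alpha for a ratings matrix.

    `items` has one row per respondent (resident) and one column per
    rated item; entries are the 1-5 ratings. Hypothetical example data,
    not the study's dataset.
    """
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Three perfectly consistent items yield alpha = 1.0
perfect = np.column_stack([np.arange(1, 6)] * 3)
print(cronbach_alpha(perfect))  # → 1.0
```

A matrix in which items track each other closely produces alpha near 1, matching the > 0.90 values reported for each competency score; weakly related items pull alpha toward 0.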
Faculty ratings of core competencies are internally consistent. The pattern of statistically significant correlations between competency ratings and USMLE and ABSITE scores supports, respectively, the postdictive and concurrent validity of faculty perceptions of resident knowledge. The increase in ratings as a function of PGY supports the construct validity of faculty ratings of resident core competencies.