Hypothesis
Expert performance levels can be established as training end points for a basic video-trainer skills curriculum, and these levels will be suitable training goals.
Design
Fifty subjects with minimal prior simulator exposure were enrolled using an institutional review board–approved protocol. As a measure of baseline performance, medical students (n = 11) and surgery residents (n = 39) completed 3 trials on each of 5 validated video-trainer tasks. Four board-certified surgeons established as laparoscopic experts (with more than 250 basic and more than 50 advanced cases) performed 11 trials on each of the 5 tasks. The mean expert score was determined and outliers (>2 SDs from the mean) were trimmed; the trimmed mean was used as the competency level. Baseline performance of each subject was compared with the competency level for each task.
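As a minimal sketch, the trimmed-mean criterion described above can be computed as follows. The function name, variable names, and the sample completion times are illustrative assumptions, not data from the study:

```python
import statistics

def competency_level(expert_scores, sd_cutoff=2.0):
    """Trimmed mean of expert trial scores: drop any trial more than
    sd_cutoff standard deviations from the raw mean, then re-average
    the remaining trials."""
    mean = statistics.mean(expert_scores)
    sd = statistics.stdev(expert_scores)
    kept = [s for s in expert_scores if abs(s - mean) <= sd_cutoff * sd]
    return statistics.mean(kept)

# Hypothetical completion times (seconds) for 11 expert trials on one
# task; the 90-second trial lies >2 SDs from the mean and is trimmed.
times = [42, 45, 44, 43, 41, 90, 44, 46, 43, 42, 45]
print(competency_level(times))  # → 43.5
```

In the study, each task's competency level would pool all expert trials for that task before trimming; the sketch shows the trim-and-re-average step only.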
Setting
All research was performed in a laparoscopic skills training and simulation laboratory.
Participants
Medical students, surgical residents, and board-certified surgeons.
Main Outcome Measures
Expert scores based on completion time and the number of subjects achieving these scores at baseline testing.
Results
For all tasks combined, the competency level was reached by 6% of subjects by the third trial; 73% of these subjects were chief residents, and none were medical students.
Conclusions
These data suggest that the competency level is suitably challenging for novices yet achievable for more experienced subjects. Implementing this performance criterion may allow trainees to reliably achieve maximal benefit from the curriculum while minimizing unnecessary training.