In this prospective cohort pilot study, our objective was to validate the use of a video-based Objective Structured Assessment of Technical Skills (OSATS)–style rubric for grading corneal suturing performance. The OSATS rubric is a validated checklist used for the evaluation of technical skills.
Institutional ethics approval was obtained from the Health Sciences and Affiliated Teaching Hospitals Research Ethics Board at Queen's University (ID no. 6030891). Ophthalmology residents from all postgraduate years (1–5) were recruited, each with varying levels of surgical experience. Participants completed a surgical skill test consisting of a corneal suturing task.
Each resident was evaluated over 2 separate attempts, 3 weeks apart. Assessment at 2 time points increased the total sample size and allowed surgical skill improvement to be demonstrated with this evaluation technique. The sessions were scheduled to ensure that there was no additional surgical simulation skills practice or teaching in between. Participants watched a brief instructional video (Appendix A, available online) outlining the task and assessment criteria prior to task execution. Residents were provided a precut centred penetrating keratoplasty graft from a cadaver eye and were instructed to place 4 cardinal sutures. All corneal tissue samples were prepared by the same research team member (R.C.) to ensure consistency. Sessions were recorded in the surgical simulation wet lab using an Echo360 device (Echo360, Reston, Va.) connected to a surgical microscope. Each anonymized video was cut to eliminate dead space, randomly assigned a label, and submitted to each grader for scoring. The videos were independently and blindly marked by 2 faculty ophthalmologists, a cornea specialist and a noncornea specialist, using the OSATS-style rubric. Graders compared rating calibration after the first 5 videos to identify any possible differences in rubric usage; none were identified. IBM SPSS Statistics 27.0 (IBM, Armonk, NY) was used to analyze the OSATS scores: paired t tests, validated with the Wilcoxon signed-rank test, were used for statistical comparisons within the group, whereas interrater reliability for the OSATS-style rubric was assessed using intraclass correlations (ICCs).
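The analysis described above (paired t test cross-checked with a Wilcoxon signed-rank test, plus a two-way random-effects ICC for the 2 graders) can be sketched as follows. The study used IBM SPSS Statistics 27.0; this is an illustrative Python equivalent only, and all score values shown are hypothetical placeholders, not study data.

```python
import numpy as np
from scipy import stats

# Hypothetical total OSATS scores for 8 residents at the two sessions
# (placeholder values; the real data are not reproduced here).
session1 = np.array([18, 22, 15, 20, 25, 17, 19, 21], dtype=float)
session2 = np.array([21, 24, 17, 22, 27, 20, 21, 23], dtype=float)

# Paired t test of within-resident change, validated with the
# non-parametric Wilcoxon signed-rank test.
t_stat, t_p = stats.ttest_rel(session2, session1)
w_stat, w_p = stats.wilcoxon(session2, session1)

def icc_2_1(ratings):
    """Two-way random-effects, absolute-agreement ICC(2,1).

    `ratings` is an (n_subjects, n_raters) array of scores.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between-subjects MS
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between-raters MS
    sse = np.sum((ratings - row_means[:, None]
                  - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual MS
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical scores from the two independent graders for one session.
graders = np.column_stack(
    [session1, session1 + np.array([1, -1, 0, 2, -1, 0, 1, 0])])
icc = icc_2_1(graders)
```

The ICC variant shown is the single-rater, absolute-agreement form appropriate when each video is scored by the same 2 raters; whether SPSS was configured for single- or average-measures ICC is not stated in the text, so this choice is an assumption.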
Eight ophthalmology residents completed the corneal suturing task twice, yielding 16 video sessions. The overall ICC for the 2 independent graders was 0.916 (95% CI, 0.701–0.979; p < 0.001) for the first session and 0.870 (95% CI, 0.242–0.952; p = 0.006) for the second session. An ICC from 0.75 to 0.9 is considered good, whereas an ICC >0.9 is considered excellent.
Notably, all the mean surgical scores increased from the first to second session (Fig. 1). After the second session, respect for tissue score improved by 1.00 ± 2.00 (p = 0.200), time and motion score improved by 0.75 ± 1.2 (p = 0.119), instrument handling score improved by 0.87 ± 1.22 (p = 0.082), flow of operation and forward planning score improved by 0.44 ± 1.44 (p = 0.500), and total surgical score improved by 3.06 ± 5.91 (p = 0.186).
In conclusion, we achieved our primary objective of showing that an adapted OSATS rubric for video-based assessment is a reliable tool for evaluating ophthalmology resident corneal suturing skills, with excellent interrater reliability. Interestingly, we also showed that surgical skills improvement can be captured with video rubric grading over as few as 2 time points. Modified rubric grading of video-based simulation surgery may be applied to other microsurgical skills, with the advantages of remote evaluation and multiple expert evaluators.
Footnotes and Disclosure
The authors have no proprietary or commercial interest in any materials discussed in this article.