
Evaluating the effect of assessment form design on the quality of feedback in one Canadian ophthalmology residency program as an early adopter of CBME

Published: January 23, 2023. DOI: https://doi.org/10.1016/j.jcjo.2023.01.003
      The initial findings of this research study were presented in oral presentation format at the Annual Meeting of the Canadian Ophthalmological Society in Halifax, Nova Scotia, June 9–12, 2022.
At the core of competency-based medical education (CBME) is the philosophy of frequent assessment that is robust, continuous, and of excellent quality [1].
Ideally, narrative comments via electronic forms help to facilitate both a written record of the resident's performance at a specific task and coaching to encourage improvement [1,2].
This study explored whether the inherent structure of the assessment form used had an effect on the quality of written feedback as measured using the Quality of Assessment of Learning (QuAL) score [3].
Both the Canadian Ophthalmology Assessment Tool for Surgery (COATS) procedural assessment form (an ophthalmology-specific adaptation of the Ottawa Surgical Competency Operating Room Evaluation [O-SCORE; Hurley B, O'Connor M, University of Ottawa Department of Ophthalmology, personal communication, August 2018]) [4] and the Entrustable Professional Activity (EPA) form are routinely used at our centre, but they prompt feedback using different wording. The EPAs used at each stage of training and the forms used to capture procedural and clinical data were developed by the Department of Ophthalmology program director and an educational consultant based on the Royal College Objectives of Training and the CanMEDS Milestones Guide; input from faculty members, residents, and other stakeholders is continuously integrated into this adaptive and dynamic assessment system [5].
Following institutional ethics approval (TRAQ 6029081), all available assessment data were retrieved from Elentra, Queen's University's integrated teaching and learning platform, for July 2017 to December 2020; organized by form type (Table 1); anonymized; and coded with a unique identifier for each evaluator and target resident. Written feedback was assigned a QuAL score out of 5 based on the previously validated rubric (Table 2) [3]. All individual assessments were scored by a blinded ophthalmology faculty member, and a randomized sample of 10% was independently rescored by an ophthalmology resident to assess inter-rater reliability.
Table 1. Assessment forms included in the analysis

| Form type | CBME stage | Form number* | Feedback prompts |
|---|---|---|---|
| EPA | Transition to discipline | D1, 2, 3, 4.1,* 4.2,* 5, 6, 7, 8, 9 | "Next steps"; "Global feedback" |
| EPA | Fundamentals of discipline | F1, 2, 3, 4, 5, 6, 7, 8, 10, 11 | "Next steps"; "Global feedback" |
| EPA | Core of discipline | C1, 2, 3, 4, 5, 6, 7, 8, 9, 10 | "Next steps"; "Global feedback" |
| COATS | All stages | N/A | "Give at least 1 specific aspect of the procedure done well"; "Give at least 1 specific suggestion for improvement"; "Global feedback" |

COATS, Canadian Ophthalmology Assessment Tool for Surgery; EPA, Entrustable Professional Activity.
* Two versions of D4 were included, with 4.2 being a revised version of 4.1. "Form number" refers to the specific form used to evaluate a single EPA within a CBME stage. For example, C1 evaluates a resident's ability to "Assess and diagnose subspecialty patients with clinical presentations expected in general practice."
Table 2. Quality of Assessment of Learning (QuAL) score components, adapted from Chan et al. [3]

| Component | Question | Scoring |
|---|---|---|
| Evidence | Does the rater provide sufficient evidence about resident performance? | 0 = no comment at all; 1 = no, but comment present; 2 = somewhat; 3 = yes/full description |
| Suggestion | Does the rater provide a suggestion for improvement? | 0 = no; 1 = yes |
| Connection | Is the rater's suggestion linked to the behaviour described? | 0 = no; 1 = yes |
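The arithmetic of the rubric can be expressed as a small scoring helper. This is a minimal illustrative sketch, not code used in the study; the rule that a connection is only credited when a suggestion is present follows the QuAL rubric of Chan et al., and the function name `qual_total` is our own.

```python
def qual_total(evidence: int, suggestion: int, connection: int) -> int:
    """Combine the three QuAL subscores into the total out of 5.

    evidence:   0-3, sufficiency of evidence about resident performance
    suggestion: 0-1, was a suggestion for improvement given?
    connection: 0-1, was the suggestion linked to the behaviour described?
    """
    if not (0 <= evidence <= 3 and suggestion in (0, 1) and connection in (0, 1)):
        raise ValueError("subscore out of range")
    # A connection cannot be credited without a suggestion to link to.
    if suggestion == 0:
        connection = 0
    return evidence + suggestion + connection
```

A comment with a full description of performance, a suggestion, and a clear link between the two scores the maximum 5 of 5.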
In total, 2617 individual assessments from 20 different residents spanning postgraduate training years 1–5 were graded. The intraclass correlation coefficient (ICC) between the 2 independent graders for the total QuAL scores was excellent at 0.90 (95% CI, 0.88–0.92; p < 0.001) [6].
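The report does not state which ICC form was used for the two graders; the sketch below computes the common two-way random-effects, absolute-agreement, single-measures ICC(2,1) from the classic ANOVA mean squares, one plausible choice under the Koo and Li guideline. It is a pure-Python illustration, not the study's analysis code.

```python
def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.

    ratings: list of rows, one per rated assessment, each row holding the
    scores given by each rater (here, 2 raters per assessment).
    """
    n = len(ratings)      # rated assessments (subjects)
    k = len(ratings[0])   # raters
    grand = sum(map(sum, ratings)) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    # Partition total sum of squares into subject, rater, and error terms.
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

Perfectly agreeing raters yield an ICC of 1.0; a constant offset between raters pulls the absolute-agreement ICC below 1 even when rankings agree exactly.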
      COATS forms (n = 483) demonstrated a significantly higher total QuAL score (out of 5) versus EPA forms (n = 2134), with mean values of 3.48 versus 2.26 (p < 0.001). COATS forms also demonstrated a significantly higher “evidence of performance” score (out of 3) versus EPA forms, with mean values of 2.26 versus 1.60 (p < 0.001). Full marks were given (5 of 5 score) to 49.7% (n = 240) of COATS forms versus 16.5% (n = 353) of EPA forms (p < 0.001).
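The letter reports that the full-marks proportions differ at p < 0.001 without naming the test used; a Pearson chi-square on the 2×2 table of form type versus full marks is one standard way to check a comparison of this kind. The helper below is an illustration under that assumption, not the study's code.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (1 df, no continuity correction) for the 2x2
    table [[a, b], [c, d]].  For 1 df the chi-square survival function
    reduces to erfc(sqrt(x / 2))."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return stat, math.erfc(math.sqrt(stat / 2))

# Full marks (5 of 5): 240 of 483 COATS forms vs. 353 of 2134 EPA forms
stat, p = chi2_2x2(240, 483 - 240, 353, 2134 - 353)
print(f"chi2 = {stat:.1f}, p = {p:.2g}")  # p far below 0.001
```

With counts this size the statistic is large and the resulting p-value is consistent with the reported p < 0.001.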
EPA forms were designed for all stages of training to address key competencies in ophthalmology across a broad range of observed activities; they include fillable text boxes with the prompts "Next steps" and "Global feedback." COATS forms [4], by contrast, were used to assess only procedural competencies.
These forms include more specific prompts, asking the evaluator to identify 1 aspect of the procedure done well in addition to an element that could be improved, similar to the "stop, start, continue" method [7].
As such, COATS forms are inherently more targeted and map naturally onto the structure of the QuAL score, discouraging general comments such as "Great work." Intuitively, we hypothesized that they would yield higher QuAL scores, and indeed the mean total score was 3.48 for COATS forms versus 2.26 for EPA forms (p < 0.001). Perhaps more illustrative of the benefit of this form design is that full marks (a 5 of 5 score) were given to significantly more COATS forms than EPA forms. We suggest that these findings are likely due to the clear direction of the text-field prompts in the COATS forms versus the more nebulous "Next steps" prompt in the EPA forms. However, because the COATS forms were used exclusively for procedures, it is also possible that procedural feedback is easier to provide and naturally lends itself to constructive narrative feedback.
      Through use of the QuAL score in evaluating narrative feedback, this study demonstrates that procedural COATS forms with structured feedback prompts may be more effective in guiding evaluators to deliver comments on resident performance.

      Footnote and Disclosure

      The authors have no proprietary or commercial interest in any materials discussed in this article.

      References

1. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR; International CBME Collaborators. The role of assessment in competency-based medical education. Med Teach. 2010;32:676–682.
2. Lockyer J, Carraccio C, Chan MK, et al. Core principles of assessment in competency-based medical education. Med Teach. 2017;39:609–616.
3. Chan TM, Sebok-Syer SS, Sampson C, Monteiro S. The Quality of Assessment of Learning (QuAL) score: validity evidence for a scoring system aimed at rating short, workplace-based comments on trainee performance. Teach Learn Med. 2020;32:319–329.
4. Gofton WT, Dudek NL, Wood TJ, Balaa F, Hamstra SJ. The Ottawa Surgical Competency Operating Room Evaluation (O-SCORE): a tool to assess surgical competence. Acad Med. 2012;87:1401–1407.
5. Braund H, Dalgarno N, McEwen L, Egan R, Reid MA, Baxter S. Involving ophthalmology departmental stakeholders in developing workplace-based assessment tools. Can J Ophthalmol. 2019;54:590–600.
6. Koo TK, Li MY. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. J Chiropr Med. 2016;15:155–163.
7. Hoon A, Oliver E, Szpakowska K, Newton P. Use of the "stop, start, continue" method is associated with the production of constructive qualitative feedback by students in higher education. Assess Eval High Educ. 2015;40:755–767.