Program Evaluation in Competence by Design: A Mixed-Methods Study

Jenna Milosek, Kaylee Eady, Katherine A. Moreau
Journal of Medical Education and Curricular Development, Vol. 12, 2025
Published: 2025-02-17 (eCollection 2025/1/1)
DOI: 10.1177/23821205251321791
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11833823/pdf/
Citations: 0
Abstract
Background: The evaluation of Competence by Design (CBD) residency programs is crucial for enhancing program effectiveness. However, literature on evaluating CBD programs is limited. We conducted a 2-phase mixed-methods study to (a) assess the extent of program evaluation activities in CBD residency programs in Canada, (b) explore reasons for engaging or not engaging in these activities, (c) examine how CBD programs are conducting program evaluations, and (d) identify ways to build capacity for program evaluation.
Methods: Phase 1 involved surveying 149 program directors from specialty/subspecialty programs that transitioned to CBD between 2017 and 2020. We calculated descriptive statistics for 22 closed-ended survey items. Phase 2 comprised interviews with a subset of program directors from Phase 1. Data analysis followed a 3-step iterative process: data condensation, data display, and drawing and verifying conclusions.
Results: In Phase 1, we received 149 responses, with a 33.5% response rate. Of these, 127 (85.2%) indicated their programs engage in evaluation, while 22 (14.8%) do not. Among the 127 programs that engage in evaluation, 29 (22.8%) frequently or always develop evaluation questions, and 23 (18.1%) design evaluation proposals/plans. Reasons for engaging in evaluation included decision-making and stimulating changes in educational practices. Conversely, reasons for not engaging included lack of knowledge, personnel, and funding. In Phase 2, 15 program directors were interviewed. They reported that CBD programs face challenges such as limited resources and buy-in, rely on ad hoc evaluation methods, and use a team-based evaluation format. To enhance evaluation capacities, interviewees suggested (a) developing expertise in program evaluation, (b) acquiring evaluation resources, and (c) advocating for clear evaluation expectations.
Conclusions: Most CBD residency programs are engaged in program evaluations, but the quality is often questionable. To fully realize the potential of program evaluation, CBD programs need additional resources and support to improve evaluation practices and outcomes.