Exploring Endoscopic Competence in Gastroenterology Training: A Simulation-Based Comparative Analysis of GAGES, DOPS, and ACE Assessment Tools

Advances in Medical Education and Practice · Published 2024-01-31 · DOI: 10.2147/amep.s427076 · Impact Factor 1.8, Q2 (Education, Scientific Disciplines)
Faisal Wasim Ismail, Azam Afzal, Rafia Durrani, Rayyan Qureshi, Safia Awan, Michelle R Brown

Abstract

Purpose: Accurate and convenient evaluation tools are essential to document endoscopic competence in gastroenterology training programs. The Direct Observation of Procedural Skills (DOPS), Global Assessment of Gastrointestinal Endoscopic Skills (GAGES), and Assessment of Endoscopic Competency (ACE) are widely used, validated competency assessment tools for gastrointestinal endoscopy. However, studies comparing these three tools are lacking, leaving this assessment unstandardized. Through simulation, this study seeks to determine the most reliable, comprehensive, and user-friendly tool for standardizing endoscopy competency assessment.
Methods: A mixed-methods quantitative-qualitative approach with a sequential deductive design was used. All nine trainees in a gastroenterology training program were assessed on endoscopic procedural competence using the Simbionix Gi-bronch-mentor high-fidelity simulator, with two faculty raters independently completing the three assessment forms (DOPS, GAGES, and ACE). Psychometric analysis was used to evaluate the tools’ reliability. Additionally, faculty trainers participated in a focus group discussion (FGD) to explore their experience of using the tools.
Results: For upper GI endoscopy, Cronbach’s alpha values for internal consistency were 0.53, 0.80, and 0.87 for ACE, DOPS, and GAGES, respectively. Inter-rater reliability (IRR) scores were 0.79 (0.43 to 0.92) for ACE, 0.75 (−0.13 to 0.82) for DOPS, and 0.59 (−0.90 to 0.84) for GAGES. For colonoscopy, Cronbach’s alpha values were 0.53, 0.82, and 0.85 for ACE, DOPS, and GAGES, respectively. IRR scores were 0.72 (0.39 to 0.96) for ACE, 0.78 (−0.12 to 0.86) for DOPS, and 0.53 (−0.91 to 0.78) for GAGES. The FGD yielded three key themes: the ideal tool should be scientifically sound, comprehensive, and user-friendly.
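For readers unfamiliar with the internal-consistency statistic reported above, Cronbach’s alpha can be computed directly from a subjects-by-items score matrix. The sketch below is a minimal illustration in Python/NumPy, not the authors’ analysis code; the `cronbach_alpha` helper name and the demo matrix are hypothetical.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects, k_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item across subjects
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of subjects' total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Perfectly consistent items (every item scores each subject identically)
# give the maximum alpha of 1.0; noisier items pull alpha down.
demo = np.repeat(np.arange(9.0).reshape(9, 1), 5, axis=1)
print(round(cronbach_alpha(demo), 2))  # → 1.0
```

Values near the conventional 0.7 threshold or above (as DOPS and GAGES show here) indicate that the tool’s items behave as a coherent scale.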
Conclusion: The DOPS tool performed favourably in both the psychometric evaluation and the qualitative assessment, making it the most balanced of the three assessment tools. We propose that DOPS be used for endoscopic skill assessment in gastroenterology training programs. However, programs should match their learning outcomes to the available assessment tools to determine the most appropriate one for their context.

Keywords: simulation, training, competence, endoscopy, gastroenterology