Development of a competency-based formative progress test with student-generated MCQs: Results from a multi-centre pilot study.

GMS Zeitschrift für Medizinische Ausbildung | Pub Date: 2015-10-15 | eCollection Date: 2015-01-01 | DOI: 10.3205/zma000988
Stefan Wagener, Andreas Möltner, Sevgi Tımbıl, Maryna Gornostayeva, Jobst-Hendrik Schultz, Peter Brüstle, Daniela Mohr, Anna Vander Beken, Julian Better, Martin Fries, Marc Gottschalk, Janine Günther, Laura Herrmann, Christian Kreisel, Tobias Moczko, Claudius Illg, Adam Jassowicz, Andreas Müller, Moritz Niesert, Felix Strübing, Jana Jünger
{"title":"Development of a competency-based formative progress test with student-generated MCQs: Results from a multi-centre pilot study.","authors":"Stefan Wagener, Andreas Möltner, Sevgi Tımbıl, Maryna Gornostayeva, Jobst-Hendrik Schultz, Peter Brüstle, Daniela Mohr, Anna Vander Beken, Julian Better, Martin Fries, Marc Gottschalk, Janine Günther, Laura Herrmann, Christian Kreisel, Tobias Moczko, Claudius Illg, Adam Jassowicz, Andreas Müller, Moritz Niesert, Felix Strübing, Jana Jünger","doi":"10.3205/zma000988","DOIUrl":null,"url":null,"abstract":"<p><strong>Introduction: </strong>Progress tests provide students feedback on their level of proficiency over the course of their medical studies. Peer-assisted learning and competency-based education have become increasingly important in medical education. Although progress tests have been proven to be useful as a longitudinal feedback instrument, there are currently no progress tests that have been created in cooperation with students or that focus on competency in medical education. In this study, we investigated the extent to which students can be included in the development of a progress test and demonstrated that aspects of knowledge related to competency can be represented on a competency-based progress test.</p><p><strong>Methods: </strong>A two-dimensional blueprint for 144 multiple-choice questions (MCQs) covering groups of medical subjects and groups of competency areas was generated by three expert groups for developing the competency-based progress test. A total of 31 students from seven medical schools in Germany actively participated in this exercise. After completing an intensive and comprehensive training programme, the students generated and reviewed the test questions for the competency-based progress test using a separate platform of the ItemManagementSystem (IMS). This test was administered as a formative test to 469 students in a pilot study in November 2013 at eight medical schools in Germany. The scores were analysed for the overall test and differentiated according to the subject groups and competency areas.</p><p><strong>Results: </strong>A pool of more than 200 MCQs was compiled by the students for pilot use, of which 118 student-generated MCQs were used in the progress test. University instructors supplemented this pool with 26 MCQs, which primarily addressed the area of scientific skills. The post-review showed that student-generated MCQs were of high quality with regard to test statistic criteria and content. Overall, the progress test displayed a very high reliability. When the academic years were compared, the progress test mapped out over the course of study not only by the overall test but also in terms of the subject groups and competency areas.</p><p><strong>Outlook: </strong>Further development in cooperation with students will be continued. Focus will be on compiling additional questions and test formats that can represent competency at a higher skill level, such as key feature questions, situational judgement test questions and OSCE. In addition, the feedback formats will be successively expanded. 
The intention is also to offer the formative competency-based progress test online.</p>","PeriodicalId":30054,"journal":{"name":"GMS Zeitschrift fur Medizinische Ausbildung","volume":"32 4","pages":"Doc46"},"PeriodicalIF":0.0000,"publicationDate":"2015-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4606478/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"GMS Zeitschrift fur Medizinische Ausbildung","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3205/zma000988","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2015/1/1 0:00:00","PubModel":"eCollection","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Introduction: Progress tests provide students with feedback on their level of proficiency over the course of their medical studies. Peer-assisted learning and competency-based education have become increasingly important in medical education. Although progress tests have proven useful as a longitudinal feedback instrument, there are currently no progress tests that have been created in cooperation with students or that focus on competency in medical education. In this study, we investigated the extent to which students can be included in the development of a progress test and demonstrated that aspects of knowledge related to competency can be represented in a competency-based progress test.

Methods: To develop the competency-based progress test, three expert groups generated a two-dimensional blueprint for 144 multiple-choice questions (MCQs) covering groups of medical subjects and groups of competency areas. A total of 31 students from seven medical schools in Germany actively participated in this exercise. After completing an intensive and comprehensive training programme, the students generated and reviewed the test questions for the competency-based progress test on a separate platform of the ItemManagementSystem (IMS). The test was administered as a formative test to 469 students in a pilot study at eight medical schools in Germany in November 2013. Scores were analysed for the overall test and differentiated according to subject groups and competency areas.
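
The sketch below is purely illustrative and not taken from the paper: it shows, in Python, one way a two-dimensional blueprint crossing subject groups with competency areas could be represented, and how overall scores and per-dimension subscores could be derived from a 0/1 response matrix. The group labels, item counts per cell and simulated responses are hypothetical placeholders; only the totals (144 items, 469 examinees) follow the study description.

```python
# Hypothetical sketch of a two-dimensional test blueprint (subject groups x
# competency areas) and of score analysis per dimension; labels and cell
# counts are invented for illustration.
import numpy as np
import pandas as pd

blueprint = pd.DataFrame(
    {
        "Medical expertise": [30, 24, 18],
        "Communication": [12, 10, 8],
        "Scientific skills": [18, 14, 10],
    },
    index=["Preclinical subjects", "Clinical-theoretical subjects", "Clinical subjects"],
)
assert blueprint.values.sum() == 144  # total number of MCQs in the blueprint

rng = np.random.default_rng(0)
n_students, n_items = 469, 144

# Each item is tagged with its blueprint cell; responses are coded 1 = correct.
item_tags = pd.DataFrame(
    {
        "subject_group": rng.choice(blueprint.index, n_items),
        "competency_area": rng.choice(blueprint.columns, n_items),
    }
)
responses = pd.DataFrame(rng.integers(0, 2, size=(n_students, n_items)))

# Overall score plus subscores per subject group and per competency area.
overall = responses.sum(axis=1)
by_subject = responses.T.groupby(item_tags["subject_group"]).sum().T
by_competency = responses.T.groupby(item_tags["competency_area"]).sum().T
print(overall.describe())
print(by_subject.mean())
print(by_competency.mean())
```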

Results: The students compiled a pool of more than 200 MCQs for pilot use, of which 118 student-generated MCQs were included in the progress test. University instructors supplemented this pool with 26 MCQs, which primarily addressed the area of scientific skills. The post-review showed that the student-generated MCQs were of high quality with regard to both test statistics and content. Overall, the progress test displayed very high reliability. When the academic years were compared, the progress test reflected students' progression over the course of study, not only in the overall test score but also within the individual subject groups and competency areas.
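
As a hedged illustration of the kind of classical test-theory figures such a post-review typically draws on (not the study's actual analysis or data), the sketch below computes test reliability via Cronbach's alpha (equivalent to KR-20 for dichotomously scored MCQs) together with item difficulty and corrected item-total discrimination on a simulated response matrix.

```python
# Classical test-theory statistics on a simulated 0/1 response matrix;
# with real data, alpha and the discrimination indices would be clearly
# positive, whereas purely random responses yield values near zero.
import numpy as np

rng = np.random.default_rng(1)
scores = rng.integers(0, 2, size=(469, 144)).astype(float)  # examinees x items

def cronbach_alpha(x: np.ndarray) -> float:
    """Cronbach's alpha; equals KR-20 when items are scored 0/1."""
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1).sum()
    total_variance = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

def item_statistics(x: np.ndarray):
    """Item difficulty (proportion correct) and corrected item-total correlation."""
    difficulty = x.mean(axis=0)
    total = x.sum(axis=1)
    discrimination = np.array(
        [np.corrcoef(x[:, j], total - x[:, j])[0, 1] for j in range(x.shape[1])]
    )
    return difficulty, discrimination

print(f"alpha = {cronbach_alpha(scores):.3f}")
p, r = item_statistics(scores)
print(f"mean difficulty = {p.mean():.2f}, mean discrimination = {r.mean():.2f}")
```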

Outlook: Further development in cooperation with students will continue. The focus will be on compiling additional questions and test formats that can represent competency at a higher skill level, such as key feature questions, situational judgement test questions and OSCEs. In addition, the feedback formats will be successively expanded. The intention is also to offer the formative competency-based progress test online.
