A Natural-Language-Processing-Based Procedure for Generating Distractors for Multiple-Choice Questions.

IF 2.2 | JCR Q2, Health Care Sciences & Services | CAS Region 3, Medicine | Evaluation & the Health Professions | Pub Date: 2022-12-01 | DOI: 10.1177/01632787211046981
Peter Baldwin, Janet Mee, Victoria Yaneva, Miguel Paniagua, Jean D'Angelo, Kimberly Swygert, Brian E Clauser
{"title":"A Natural-Language-Processing-Based Procedure for Generating Distractors for Multiple-Choice Questions.","authors":"Peter Baldwin,&nbsp;Janet Mee,&nbsp;Victoria Yaneva,&nbsp;Miguel Paniagua,&nbsp;Jean D'Angelo,&nbsp;Kimberly Swygert,&nbsp;Brian E Clauser","doi":"10.1177/01632787211046981","DOIUrl":null,"url":null,"abstract":"<p><p>One of the most challenging aspects of writing multiple-choice test questions is identifying plausible incorrect response options-i.e., distractors. To help with this task, a procedure is introduced that can mine existing item banks for potential distractors by considering the similarities between a new item's stem and answer and the stems and response options for items in the bank. This approach uses natural language processing to measure similarity and requires a substantial pool of items for constructing the generating model. The procedure is demonstrated with data from the United States Medical Licensing Examination (USMLE®). For about half the items in the study, at least one of the top three system-produced candidates matched a human-produced distractor exactly; and for about one quarter of the items, two of the top three candidates matched human-produced distractors. A study was conducted in which a sample of system-produced candidates were shown to 10 experienced item writers. Overall, participants thought about 81% of the candidates were on topic and 56% would help human item writers with the task of writing distractors.</p>","PeriodicalId":12315,"journal":{"name":"Evaluation & the Health Professions","volume":null,"pages":null},"PeriodicalIF":2.2000,"publicationDate":"2022-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Evaluation & the Health Professions","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1177/01632787211046981","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"HEALTH CARE SCIENCES & SERVICES","Score":null,"Total":0}
Citations: 0

Abstract

One of the most challenging aspects of writing multiple-choice test questions is identifying plausible incorrect response options, i.e., distractors. To help with this task, a procedure is introduced that can mine existing item banks for potential distractors by considering the similarities between a new item's stem and answer and the stems and response options for items in the bank. This approach uses natural language processing to measure similarity and requires a substantial pool of items for constructing the generating model. The procedure is demonstrated with data from the United States Medical Licensing Examination (USMLE®). For about half the items in the study, at least one of the top three system-produced candidates matched a human-produced distractor exactly; and for about one quarter of the items, two of the top three candidates matched human-produced distractors. A study was conducted in which a sample of system-produced candidates was shown to 10 experienced item writers. Overall, participants thought about 81% of the candidates were on topic and 56% would help human item writers with the task of writing distractors.
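To make the mining step concrete, below is a minimal sketch in Python. The abstract does not specify which NLP similarity measure the authors used, so TF-IDF cosine similarity stands in here; the item bank, the function name `suggest_distractors`, and the sample items are all hypothetical illustrations, not the paper's actual data or implementation.

```python
# Minimal sketch of distractor mining from an item bank, assuming TF-IDF
# cosine similarity as the NLP similarity measure (the paper's actual
# measure is not named in the abstract).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical item bank: each item has a stem, its keyed answer, and
# its incorrect options (distractors).
ITEM_BANK = [
    {
        "stem": "A 55-year-old man presents with crushing chest pain ...",
        "answer": "Myocardial infarction",
        "distractors": ["Pulmonary embolism", "Aortic dissection", "Pericarditis"],
    },
    {
        "stem": "A 30-year-old woman has sudden pleuritic chest pain ...",
        "answer": "Pulmonary embolism",
        "distractors": ["Pneumothorax", "Myocardial infarction", "Costochondritis"],
    },
]

def suggest_distractors(new_stem, new_answer, bank, top_k=3):
    """Rank the bank's response options as candidate distractors for a
    new item, scoring each bank item by the similarity of its stem-plus-
    answer text to the new item's stem-plus-answer text."""
    query = new_stem + " " + new_answer
    docs = [item["stem"] + " " + item["answer"] for item in bank]
    vectorizer = TfidfVectorizer().fit(docs + [query])
    doc_vecs = vectorizer.transform(docs)
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_vecs)[0]

    # Pool the options of similar items, carrying each source item's
    # similarity score; skip anything equal to the new item's keyed answer.
    candidates = {}
    for score, item in zip(scores, bank):
        for option in item["distractors"] + [item["answer"]]:
            if option.lower() != new_answer.lower():
                candidates[option] = max(candidates.get(option, 0.0), float(score))
    ranked = sorted(candidates.items(), key=lambda kv: -kv[1])
    return [option for option, _ in ranked[:top_k]]

print(suggest_distractors(
    "A 48-year-old man develops substernal chest pressure on exertion ...",
    "Stable angina",
    ITEM_BANK,
))
```

In line with the study's workflow, candidates produced this way would be treated as suggestions for experienced item writers to screen, not as finished distractors.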

Source Journal
CiteScore: 5.30
Self-citation rate: 0.00%
Articles published per year: 31
Review time: >12 weeks
About the Journal: Evaluation & the Health Professions is a peer-reviewed, quarterly journal that provides health-related professionals with state-of-the-art methodological, measurement, and statistical tools for conceptualizing the etiology of health promotion and problems, and for developing, implementing, and evaluating health programs, teaching and training services, and products that pertain to a myriad of health dimensions. This journal is a member of the Committee on Publication Ethics (COPE). Average time from submission to first decision: 31 days.
Latest Articles in This Journal
The Use of Contribution Analysis in Evaluating Health Interventions: A Scoping Review.
Impact of Multi-point Nursing Strategies Under a Clinical Problem-Solving Framework on Adverse Events Associated With Thyroid Nodule Resection.
Real Patient Participation in Workplace-Based Assessment of Health Professional Trainees: A Scoping Review.
The Validity and Reliability of the Turkish Version of Self-Perceived Barriers for Physical Activity Questionnaire.
Factors Associated With Agreement Between Parent and Childhood Cancer Survivor Reports on Child's Health Related Quality of Life.