Validation of the blended learning usability evaluation – questionnaire (BLUE-Q) through an innovative Bayesian questionnaire validation approach: a methodological study

Journal of Educational Evaluation for Health Professions (IF 9.3, Q1: Education, Scientific Disciplines) | Pub Date: 2024-01-01 | Epub Date: 2024-11-07 | DOI: 10.3352/jeehp.2024.21.31
Anish Kumar Arora, Charo Rodriguez, Tamara Carver, Hao Zhang, Tibor Schuster
{"title":"Validation of the blended learning usability evaluation – questionnaire (BLUE-Q) through an innovative Bayesian questionnaire validation approach: a methodological study","authors":"Anish Kumar Arora, Charo Rodriguez, Tamara Carver, Hao Zhang, Tibor Schuster","doi":"10.3352/jeehp.2024.21.31","DOIUrl":null,"url":null,"abstract":"<p><strong>Purpose: </strong>The primary aim of this study is to validate the Blended Learning Usability Evaluation - Questionnaire (BLUE-Q) for use in the field of health professions education through a Bayesian approach. As Bayesian questionnaire validation remains elusive, a secondary aim of this article is to serve as a simplified tutorial for engaging in such validation practices in health professions education.</p><p><strong>Methods: </strong>A total of 10 health education-based experts in blended learning were recruited to participate in a 30-minute interviewer-administered survey. On a 5-point Likert scale, experts rated how well they perceived each item of the BLUE-Q to reflect its underlying usability domain (i.e., effectiveness, efficiency, satisfaction, accessibility, organization, and learner experience). Ratings were descriptively analyzed and converted into beta prior distributions. Participants were also given the option to provide qualitative comments for each item.</p><p><strong>Results: </strong>After reviewing the computed expert prior distributions, 31 quantitative items were identified as having a probability of 'low endorsement' and were thus removed from the questionnaire. Additionally, qualitative comments were used to revise the phrasing and order of items to ensure clarity and logical flow. The BLUE-Q's final version comprises 23 Likert-scale items and 6 open-ended items.</p><p><strong>Conclusion: </strong>Questionnaire validation can generally be a complex, time-consuming, and costly process, inhibiting many from engaging in proper validation practices. In this study, we demonstrate that a Bayesian questionnaire validation approach can be a simple, resource-efficient, yet rigorous solution to validating a tool for content and item-domain correlation through the elicitation of domain expert endorsement ratings.</p>","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"21 ","pages":"31"},"PeriodicalIF":9.3000,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Educational Evaluation for Health Professions","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3352/jeehp.2024.21.31","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/11/7 0:00:00","PubModel":"Epub","JCR":"Q1","JCRName":"EDUCATION, SCIENTIFIC DISCIPLINES","Score":null,"Total":0}
Citations: 0

Abstract

Purpose: The primary aim of this study is to validate the Blended Learning Usability Evaluation - Questionnaire (BLUE-Q) for use in the field of health professions education through a Bayesian approach. As Bayesian questionnaire validation remains elusive, a secondary aim of this article is to serve as a simplified tutorial for engaging in such validation practices in health professions education.

Methods: A total of 10 experts in blended learning from the field of health education were recruited to participate in a 30-minute interviewer-administered survey. On a 5-point Likert scale, experts rated how well they perceived each item of the BLUE-Q to reflect its underlying usability domain (i.e., effectiveness, efficiency, satisfaction, accessibility, organization, and learner experience). Ratings were descriptively analyzed and converted into beta prior distributions. Participants were also given the option to provide qualitative comments on each item.
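The abstract does not spell out how the 5-point ratings were turned into beta prior distributions, so the following is only a minimal sketch of one plausible approach: rescale the Likert ratings onto the unit interval and fit a Beta distribution by the method of moments. The function name likert_to_beta and the example ratings are hypothetical, not taken from the study.

```python
# Minimal sketch (not the authors' exact procedure): map 5-point Likert
# endorsement ratings onto (0, 1) and fit a Beta prior by method of moments.
import numpy as np

def likert_to_beta(ratings, levels=5):
    """Fit Beta(a, b) to Likert ratings (1..levels) rescaled to (0, 1)."""
    r = np.asarray(ratings, dtype=float)
    # Rescale so ratings sit strictly inside (0, 1), away from the
    # boundaries where a Beta density can become degenerate.
    p = (r - 0.5) / levels
    m, v = p.mean(), p.var(ddof=1)  # assumes ratings are not all identical
    # Method-of-moments estimates of the Beta shape parameters.
    common = m * (1 - m) / v - 1
    a, b = m * common, (1 - m) * common
    return a, b

# Example: ten hypothetical expert ratings for one BLUE-Q item.
a, b = likert_to_beta([5, 4, 4, 5, 3, 4, 5, 4, 2, 4])
print(f"Beta prior: a={a:.2f}, b={b:.2f}")
```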

Results: After reviewing the computed expert prior distributions, 31 quantitative items were identified as having a high probability of 'low endorsement' and were thus removed from the questionnaire. Additionally, qualitative comments were used to revise the phrasing and order of items to ensure clarity and logical flow. The final version of the BLUE-Q comprises 23 Likert-scale items and 6 open-ended items.
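As a purely illustrative follow-up to the sketch above: given a Beta prior for an item, the probability of 'low endorsement' can be read off the prior's cumulative distribution function and compared against a decision threshold. The 0.6 endorsement cutoff and the 0.5 removal threshold below are assumptions for the example, not values reported by the authors.

```python
# Illustrative item-screening step (cutoff and threshold are assumptions):
# compute the prior probability that an item's endorsement falls below a
# cutoff, and flag the item for removal if that probability is high.
from scipy.stats import beta as beta_dist

def low_endorsement_probability(a, b, cutoff=0.6):
    """P(endorsement < cutoff) under a Beta(a, b) prior."""
    return beta_dist.cdf(cutoff, a, b)

# Continuing the hypothetical Beta prior fitted above.
p_low = low_endorsement_probability(3.43, 1.47, cutoff=0.6)
remove_item = p_low > 0.5
print(f"P(low endorsement) = {p_low:.2f}; remove item: {remove_item}")
```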

Conclusion: Questionnaire validation can generally be a complex, time-consuming, and costly process, inhibiting many from engaging in proper validation practices. In this study, we demonstrate that a Bayesian questionnaire validation approach can be a simple, resource-efficient, yet rigorous solution to validating a tool for content and item-domain correlation through the elicitation of domain expert endorsement ratings.

Journal metrics: CiteScore 9.60 | Self-citation rate 9.10% | Articles published 32 | Review time 5 weeks
About the journal: Journal of Educational Evaluation for Health Professions aims to provide readers with state-of-the-art, practical information on educational evaluation for the health professions, so as to improve the quality of undergraduate, graduate, and continuing education. It specializes in educational evaluation, including the application of measurement theory to health education, high-stakes examinations such as national licensing examinations, improvement of nationwide or international educational programs, computer-based testing, computerized adaptive testing, and health regulatory bodies. Its scope covers a variety of professions that address public health, including but not limited to: care workers, dental hygienists, dental technicians, dentists, dietitians, emergency medical technicians, health educators, medical record technicians, medical technologists, midwives, nurses, nursing aides, occupational therapists, opticians, oriental medical doctors, oriental medicine dispensers, oriental pharmacists, pharmacists, physical therapists, physicians, prosthetists and orthotists, radiological technologists, rehabilitation counselors, sanitary technicians, and speech-language therapists.
Latest articles in this journal:
The irtQ R package: a user-friendly tool for item response theory-based test data analysis and calibration.
Insights into undergraduate medical student selection tools: a systematic review and meta-analysis.
Importance, performance frequency, and predicted future importance of dietitians' jobs by practicing dietitians in Korea: a survey study.
Presidential address 2024: the expansion of computer-based testing to numerous health professions licensing examinations in Korea, preparation of computer-based practical tests, and adoption of the medical metaverse.
Development and validity evidence for the resident-led large group teaching assessment instrument in the United States: a methodological study.