Mohamed H Taha, Hosam Eldeen Elsadig Gasmalla Mohammed, Mohamed Elhassan Abdalla, Muhamad Saiful Bahri Yusoff, Mohd Kamal Mohd Napiah, Majed M Wadi
{"title":"卫生专业教育中扩展匹配问题(EMQs)有效性证据的报告和展示模式:系统综述。","authors":"Mohamed H Taha, Hosam Eldeen Elsadig Gasmalla Mohammed, Mohamed Elhassan Abdalla, Muhamad Saiful Bahri Yusoff, Mohd Kamal Mohd Napiah, Majed M Wadi","doi":"10.1080/10872981.2024.2412392","DOIUrl":null,"url":null,"abstract":"<p><p>The Extended matching Questions (EMQs), or R-type questions, are format of selected-response. The validity evidence for this format is crucial, but there have been reports of misunderstandings about validity. It is unclear what kinds of evidence should be presented and how to present them to support their educational impact. This review explores the pattern and quality of reporting the sources of validity evidence of EMQs in health professions education, encompassing content, response process, internal structure, relationship to other variables, and consequences. A systematic search in the electronic databases including MEDLINE via PubMed, Scopus, Web of Science, CINAHL, and ERIC was conducted to extract studies that utilize EMQs. The framework for a unitary concept of validity was applied to extract data. A total of 218 titles were initially selected, the final number of titles was 19. The most reported pieces of evidence were the reliability coefficient, followed by the relationship to another variable. Additionally, the adopted definition of validity is mostly the old tripartite concept. This study found that reporting and presenting validity evidence appeared to be deficient. The available evidence can hardly provide a strong validity argument that supports the educational impact of EMQs. 
This review calls for more work on developing a tool to measure the reporting and presenting validity evidence.</p>","PeriodicalId":47656,"journal":{"name":"Medical Education Online","volume":"29 1","pages":"2412392"},"PeriodicalIF":3.1000,"publicationDate":"2024-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11504699/pdf/","citationCount":"0","resultStr":"{\"title\":\"The pattern of reporting and presenting validity evidence of extended matching questions (EMQs) in health professions education: a systematic review.\",\"authors\":\"Mohamed H Taha, Hosam Eldeen Elsadig Gasmalla Mohammed, Mohamed Elhassan Abdalla, Muhamad Saiful Bahri Yusoff, Mohd Kamal Mohd Napiah, Majed M Wadi\",\"doi\":\"10.1080/10872981.2024.2412392\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>The Extended matching Questions (EMQs), or R-type questions, are format of selected-response. The validity evidence for this format is crucial, but there have been reports of misunderstandings about validity. It is unclear what kinds of evidence should be presented and how to present them to support their educational impact. This review explores the pattern and quality of reporting the sources of validity evidence of EMQs in health professions education, encompassing content, response process, internal structure, relationship to other variables, and consequences. A systematic search in the electronic databases including MEDLINE via PubMed, Scopus, Web of Science, CINAHL, and ERIC was conducted to extract studies that utilize EMQs. The framework for a unitary concept of validity was applied to extract data. A total of 218 titles were initially selected, the final number of titles was 19. The most reported pieces of evidence were the reliability coefficient, followed by the relationship to another variable. Additionally, the adopted definition of validity is mostly the old tripartite concept. 
This study found that reporting and presenting validity evidence appeared to be deficient. The available evidence can hardly provide a strong validity argument that supports the educational impact of EMQs. This review calls for more work on developing a tool to measure the reporting and presenting validity evidence.</p>\",\"PeriodicalId\":47656,\"journal\":{\"name\":\"Medical Education Online\",\"volume\":\"29 1\",\"pages\":\"2412392\"},\"PeriodicalIF\":3.1000,\"publicationDate\":\"2024-12-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11504699/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Medical Education Online\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1080/10872981.2024.2412392\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/10/24 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q1\",\"JCRName\":\"EDUCATION & EDUCATIONAL RESEARCH\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Medical Education Online","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1080/10872981.2024.2412392","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/10/24 0:00:00","PubModel":"Epub","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0
The pattern of reporting and presenting validity evidence of extended matching questions (EMQs) in health professions education: a systematic review.
Extended matching questions (EMQs), also called R-type questions, are a selected-response format. Validity evidence for this format is crucial, but misunderstandings about validity have been reported. It is unclear what kinds of evidence should be presented, and how, to support the format's educational impact. This review explores the pattern and quality of reporting the sources of validity evidence for EMQs in health professions education, encompassing content, response process, internal structure, relationships to other variables, and consequences. A systematic search of the electronic databases MEDLINE (via PubMed), Scopus, Web of Science, CINAHL, and ERIC was conducted to identify studies that used EMQs. The framework of the unitary concept of validity was applied to extract data. A total of 218 titles were initially selected; after screening, 19 remained. The most frequently reported evidence was the reliability coefficient, followed by relationships to other variables. Additionally, the definition of validity adopted was mostly the old tripartite concept. This review found the reporting and presentation of validity evidence to be deficient: the available evidence can hardly sustain a strong validity argument supporting the educational impact of EMQs. The review calls for further work on developing a tool to measure the reporting and presentation of validity evidence.
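As an aside for readers unfamiliar with the statistic: the reliability coefficient the review found most often reported for EMQs is typically an internal-consistency estimate such as Cronbach's alpha. A minimal sketch of how alpha is computed from a score matrix is shown below; the data are invented for illustration and are not taken from the review.

```python
# Illustrative only: Cronbach's alpha from a matrix of item scores.
# Rows = examinees, columns = dichotomously scored EMQ items (1 = correct).
def cronbach_alpha(scores):
    """alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of totals)."""
    k = len(scores[0])  # number of items

    def var(xs):
        # Sample variance (denominator n - 1).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical scores for 5 examinees on 4 EMQ items.
scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(scores), 3))  # 0.606
```

Reporting alpha alone, as the review notes, covers only the internal-structure source of validity evidence; the other four sources in the unitary framework require separate evidence.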
期刊介绍:
Medical Education Online is an open access journal of health care education, publishing peer-reviewed research, perspectives, reviews, and early documentation of new ideas and trends.
Medical Education Online aims to disseminate information on the education and training of physicians and other health care professionals. Manuscripts may address any aspect of health care education and training, including, but not limited to:
-Basic science education
-Clinical science education
-Residency education
-Learning theory
-Problem-based learning (PBL)
-Curriculum development
-Research design and statistics
-Measurement and evaluation
-Faculty development
-Informatics/web