Xian Zhao MD, MEd, Aneka Khilnani MS, Debra L. Weiner MD, PhD, Katie A. Donnelly MD, MPH, Christina E. Lindgren MD, Jennifer Chapman MD, Pavan Zaveri MD, MEd, William Benjamin Prince MD, Rosemary Thomas-Mohtat MD

AEM Education and Training, 8(1). Published 2024-02-21. DOI: 10.1002/aet2.10938
Development and evaluation of a novel knowledge assessment tool for pediatric emergency medicine clerkships
Objectives
This study sought to gather validity evidence for a newly developed multiple-choice examination (MCE) tool that assesses retention and application of medical knowledge by students enrolled in a pediatric emergency medicine (PEM) clerkship.
Methods
A team of PEM physicians created a 110-item MCE covering the range of PEM clinical topics relevant to medical students. The researchers determined examination content using the report of the Clerkship Directors in Emergency Medicine and the PEM Interest Group of the Society for Academic Emergency Medicine (SAEM). The authors administered the MCE to fourth-year medical students at the end of their PEM rotations at four institutions from May 2020 to April 2023 and then analyzed the examination using four of Messick's five sources of validity evidence: content, response process, internal structure, and relation to other variables.
Results
A total of 158 students took the test. In academic year (AY) 20–21, 47 students took the test and scored, on average, 81%. After revision of poor and indeterminate questions, the 111 medical students who took the revised version of the test in AY21–AY23 scored an average of 77.3% (standard deviation 5.7%), with scores normally distributed. Based on item discrimination, the revised questions were rated as excellent (10.0%), good (26.4%), fair (34.5%), poor (24.5%), or indeterminate (4.5%). There was a positive correlation between MCE scores and students' clinical evaluations, but no correlation between MCE scores and the scores students received on their clinical notes or patient presentations during case conference.
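Item discrimination of the kind used to grade these questions is commonly computed as an upper-lower index: the difference in proportion correct between high- and low-scoring groups of examinees. The sketch below illustrates that general technique in pure Python; the 27% group fraction, the rating cutoffs, and the function names are illustrative assumptions, not the study's reported method.

```python
def discrimination_indices(responses, group_frac=0.27):
    """Upper-lower discrimination index for each test item.

    responses: list of per-student lists of 0/1 item scores.
    Compares the proportion correct between the top- and bottom-scoring
    groups (classically the top and bottom 27% by total score).
    """
    n_students = len(responses)
    n_items = len(responses[0])
    ranked = sorted(responses, key=sum)          # low scorers first
    k = max(1, round(group_frac * n_students))   # group size
    low, high = ranked[:k], ranked[-k:]
    return [
        sum(s[i] for s in high) / k - sum(s[i] for s in low) / k
        for i in range(n_items)
    ]

def rate_item(d):
    """Map a discrimination index to a verbal rating.

    Cutoffs here are assumed (Ebel-style conventions); the paper does
    not state its exact thresholds.
    """
    if d >= 0.40:
        return "excellent"
    if d >= 0.30:
        return "good"
    if d >= 0.20:
        return "fair"
    if d >= 0.0:
        return "poor"
    return "indeterminate"   # assumed label for negative discrimination
```

An item answered correctly mostly by high scorers yields an index near 1.0 and rates well; an item that high and low scorers answer at similar rates yields an index near 0 and would be flagged for revision.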
Conclusions
This novel PEM clerkship examination is a reliable test of medical knowledge. Future directions include evaluating the consequences of the MCE and offering the test nationally to medical students in dedicated PEM rotations.