{"title":"对本科医学生选拔工具的见解:系统回顾和荟萃分析。","authors":"Pin-Hsiang Huang,Arash Arianpoor,Silas Taylor,Jenzel Gonzales,Boaz Shulruf","doi":"10.3352/jeehp.2024.21.22","DOIUrl":null,"url":null,"abstract":"PURPOSE\r\nEvaluating medical school selection tools is vital for evidence-based student selection. With previous reviews revealing knowledge gaps, this meta-analysis offers insights into the effectiveness of these selection tools.\r\n\r\nMETHODS\r\nA systematic review and meta-analysis were conducted applying the following criteria: peer-reviewed articles available in English, published from 2010 and which include empirical data linking performance in selection tools with assessment and dropout outcomes of undergraduate entry medical programs. Systematic reviews, meta-analyses, general opinion pieces, or commentaries were excluded. Effect sizes (ESs) of the predictability of academic and clinical performance within and by the end of the medicine program were extracted, and the pooled ESs were presented.\r\n\r\nRESULTS\r\nSixty-seven out of 2,212 articles were included, which yielded 236 ESs. Previous academic achievement predicted medical program academic performance (Cohen's d=0.697 in early program; 0.619 in end of program) and clinical exams (0.545 in end of program). Within aptitude tests, verbal reasoning and quantitative reasoning predicted academic achievement in the early program and in the last years (0.704 & 0.643, respectively). Overall aptitude tests predicted academic achievement in both the early and last years (0.550 & 0.371, respectively). Neither panel interviews, multiple mini-interviews, nor situational judgement tests (SJT) yielded statistically significant pooled ES.\r\n\r\nCONCLUSION\r\nCurrent evidence suggests that learning outcomes are predicted by previous academic achievement and aptitude tests. The predictive value of SJT and topics such as selection algorithms, features of interview (e.g., content of the questions) and the way the interviewers' reports are used, warrant further research.","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"44 1","pages":"22"},"PeriodicalIF":9.3000,"publicationDate":"2024-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Insights into undergraduate medical student selection tools: a systematic review and meta-analysis.\",\"authors\":\"Pin-Hsiang Huang,Arash Arianpoor,Silas Taylor,Jenzel Gonzales,Boaz Shulruf\",\"doi\":\"10.3352/jeehp.2024.21.22\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"PURPOSE\\r\\nEvaluating medical school selection tools is vital for evidence-based student selection. With previous reviews revealing knowledge gaps, this meta-analysis offers insights into the effectiveness of these selection tools.\\r\\n\\r\\nMETHODS\\r\\nA systematic review and meta-analysis were conducted applying the following criteria: peer-reviewed articles available in English, published from 2010 and which include empirical data linking performance in selection tools with assessment and dropout outcomes of undergraduate entry medical programs. Systematic reviews, meta-analyses, general opinion pieces, or commentaries were excluded. 
Effect sizes (ESs) of the predictability of academic and clinical performance within and by the end of the medicine program were extracted, and the pooled ESs were presented.\\r\\n\\r\\nRESULTS\\r\\nSixty-seven out of 2,212 articles were included, which yielded 236 ESs. Previous academic achievement predicted medical program academic performance (Cohen's d=0.697 in early program; 0.619 in end of program) and clinical exams (0.545 in end of program). Within aptitude tests, verbal reasoning and quantitative reasoning predicted academic achievement in the early program and in the last years (0.704 & 0.643, respectively). Overall aptitude tests predicted academic achievement in both the early and last years (0.550 & 0.371, respectively). Neither panel interviews, multiple mini-interviews, nor situational judgement tests (SJT) yielded statistically significant pooled ES.\\r\\n\\r\\nCONCLUSION\\r\\nCurrent evidence suggests that learning outcomes are predicted by previous academic achievement and aptitude tests. The predictive value of SJT and topics such as selection algorithms, features of interview (e.g., content of the questions) and the way the interviewers' reports are used, warrant further research.\",\"PeriodicalId\":46098,\"journal\":{\"name\":\"Journal of Educational Evaluation for Health Professions\",\"volume\":\"44 1\",\"pages\":\"22\"},\"PeriodicalIF\":9.3000,\"publicationDate\":\"2024-09-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Educational Evaluation for Health Professions\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3352/jeehp.2024.21.22\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"EDUCATION, SCIENTIFIC DISCIPLINES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Educational Evaluation for Health Professions","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3352/jeehp.2024.21.22","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION, SCIENTIFIC DISCIPLINES","Score":null,"Total":0}
Insights into undergraduate medical student selection tools: a systematic review and meta-analysis.
PURPOSE
Evaluating medical school selection tools is vital for evidence-based student selection. Because previous reviews have revealed knowledge gaps, this meta-analysis offers insights into the effectiveness of these selection tools.
METHODS
A systematic review and meta-analysis were conducted applying the following criteria: peer-reviewed articles available in English, published from 2010 onward, that include empirical data linking performance on selection tools with assessment and dropout outcomes of undergraduate-entry medical programs. Systematic reviews, meta-analyses, general opinion pieces, and commentaries were excluded. Effect sizes (ESs) for the prediction of academic and clinical performance during and at the end of the medical program were extracted, and pooled ESs were presented.
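The abstract does not state which pooling model was used. As an illustration only, the sketch below shows how per-study Cohen's d values and their approximate sampling variances could be combined under a standard random-effects (DerSimonian-Laird) model; the function names and the example numbers are hypothetical, not taken from the study.

```python
import math

def cohens_d_variance(d, n1, n2):
    """Approximate sampling variance of Cohen's d for two groups of sizes n1 and n2."""
    return (n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2))

def pool_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooling of effect sizes."""
    w = [1 / v for v in variances]                        # fixed-effect weights
    fixed = sum(wi * d for wi, d in zip(w, effects)) / sum(w)
    q = sum(wi * (d - fixed) ** 2 for wi, d in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)         # between-study variance
    w_star = [1 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * d for wi, d in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se

# Hypothetical example: three studies reporting Cohen's d for prior academic
# achievement predicting early-program performance (invented numbers).
ds = [0.65, 0.72, 0.70]
vs = [cohens_d_variance(d, 120, 120) for d in ds]
pooled_d, se = pool_random_effects(ds, vs)
print(f"pooled d = {pooled_d:.3f}, 95% CI = [{pooled_d - 1.96 * se:.3f}, {pooled_d + 1.96 * se:.3f}]")
```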
RESULTS
Sixty-seven of 2,212 articles were included, yielding 236 ESs. Previous academic achievement predicted academic performance in the medical program (Cohen's d=0.697 early in the program; 0.619 at the end of the program) and clinical exams (0.545 at the end of the program). Among aptitude tests, verbal reasoning and quantitative reasoning predicted academic achievement early in the program and in the final years (0.704 and 0.643, respectively). Overall aptitude test scores predicted academic achievement in both the early and final years (0.550 and 0.371, respectively). Neither panel interviews, multiple mini-interviews, nor situational judgement tests (SJTs) yielded statistically significant pooled ESs.
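Because the abstract reports predictive validity as Cohen's d, readers may find it helpful to view the same values on the correlation scale. A common approximate conversion, assuming roughly equal group sizes, is r = d / sqrt(d^2 + 4); the snippet below applies it to the pooled values quoted above. This conversion is an interpretive aid, not part of the authors' analysis.

```python
import math

def d_to_r(d):
    """Approximate conversion of Cohen's d to a correlation coefficient,
    assuming roughly equal group sizes."""
    return d / math.sqrt(d ** 2 + 4)

# Pooled d values quoted in the abstract, converted for interpretation on the r scale.
for label, d in [("prior achievement, early program", 0.697),
                 ("prior achievement, end of program", 0.619),
                 ("aptitude tests, early years", 0.550),
                 ("aptitude tests, last years", 0.371)]:
    print(f"{label}: d = {d:.3f} -> r ~= {d_to_r(d):.2f}")
```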
CONCLUSION
Current evidence suggests that learning outcomes are predicted by previous academic achievement and aptitude tests. The predictive value of SJTs, along with topics such as selection algorithms, interview features (e.g., the content of the questions), and the way interviewers' reports are used, warrants further research.
Journal overview:
Journal of Educational Evaluation for Health Professions aims to provide readers with state-of-the-art practical information on educational evaluation for the health professions, so as to improve the quality of undergraduate, graduate, and continuing education. It specializes in educational evaluation, including the application of measurement theory to medical and health education, the promotion of high-stakes examinations such as national licensing examinations, the improvement of nationwide or international education programs, computer-based testing, computerized adaptive testing, and medical and health regulatory bodies. Its field comprises a variety of professions that address public health, including but not limited to: care workers, dental hygienists, dental technicians, dentists, dietitians, emergency medical technicians, health educators, medical record technicians, medical technologists, midwives, nurses, nursing aides, occupational therapists, opticians, oriental medical doctors, oriental medicine dispensers, oriental pharmacists, pharmacists, physical therapists, physicians, prosthetists and orthotists, radiological technologists, rehabilitation counselors, sanitary technicians, and speech-language therapists.