A meta-analysis on the predictive validity of English language proficiency assessments for college admissions

Language Testing · IF 2.2 · LANGUAGE & LINGUISTICS · Pub Date: 2022-08-16 · DOI: 10.1177/02655322221112364
Samuel D. Ihlenfeldt, Joseph A. Rios
Language Testing, vol. 40, no. 1, pp. 276–299. Published 2022-08-16. Citations: 2.

Abstract

For institutions where English is the primary language of instruction, English assessments for admissions such as the Test of English as a Foreign Language (TOEFL) and International English Language Testing System (IELTS) give admissions decision-makers a sense of a student’s skills in academic English. Despite this explicit purpose, these exams have also been used for the practice of predicting academic success. In this study, we meta-analytically synthesized 132 effect sizes from 32 studies containing validity evidence of academic English assessments to determine whether different assessments (a) predicted academic success (as measured by grade point average [GPA]) and (b) did so comparably. Overall, assessments had a weak positive correlation with academic achievement (r = .231, p < .001). Additionally, no significant differences were found in the predictive power of the IELTS and TOEFL exams. No moderators were significant, indicating that these findings held true across school type, school level, and publication type. Although significant, the overall correlation was low; thus, practitioners are cautioned from using standardized English-language proficiency test scores in isolation in lieu of a holistic application review during the admissions process.
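The pooled correlation reported in the abstract comes from a meta-analytic synthesis of study-level effect sizes. A common way to pool correlations is the Fisher z-transformation under a fixed-effect model; the sketch below illustrates that standard approach with hypothetical (r, n) pairs, not the article's actual 132 effect sizes, and the authors' own procedure may differ (e.g., random-effects weighting).

```python
import math

# Hypothetical study-level results (correlation r, sample size n);
# placeholders only — the real effect sizes are in the article.
studies = [(0.20, 150), (0.30, 80), (0.15, 200), (0.28, 120)]

# Fisher z-transform each r; weight by n - 3, the inverse variance
# of z under the fixed-effect model.
num = sum((n - 3) * math.atanh(r) for r, n in studies)
den = sum(n - 3 for _, n in studies)
pooled_z = num / den

# Back-transform the pooled z to the correlation metric.
pooled_r = math.tanh(pooled_z)
print(round(pooled_r, 3))
```

Weighting by n − 3 means larger studies pull the pooled estimate toward their observed correlations, which is why a single large study can dominate a fixed-effect synthesis.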
Journal
Language Testing
CiteScore: 6.70
Self-citation rate: 9.80%
Articles per year: 35
About the journal: Language Testing is a fully peer-reviewed international journal that publishes original research and review articles on language testing and assessment. It provides a forum for the exchange of ideas and information between people working in the fields of first and second language testing and assessment. This includes researchers and practitioners in EFL and ESL testing, and assessment in child language acquisition and language pathology. In addition, special attention is focused on issues of testing theory, experimental investigations, and the follow-up of practical implications.
Latest articles in this journal

Can language test providers do more to support open science? A response to Winke
Considerations to promote and accelerate Open Science: A response to Winke
Evaluating the impact of nonverbal behavior on language ability ratings
Sharing, collaborating, and building trust: How Open Science advances language testing
Open Science in language assessment research contexts: A reply to Winke