Validation Methods for Aggregate-Level Test Scale Linking: A Rejoinder

Journal of Educational and Behavioral Statistics · IF 1.9 · CAS Tier 3 (Psychology) · JCR Q2 (Education & Educational Research) · Pub Date: 2021-04-01 · DOI: 10.3102/1076998621994540
Andrew D. Ho, Sean F. Reardon, Demetra Kalogrides
{"title":"聚合级测试量表链接的验证方法:一个反驳","authors":"Andrew D. Ho, Sean F. Reardon, Demetra Kalogrides","doi":"10.3102/1076998621994540","DOIUrl":null,"url":null,"abstract":"In this issue, Reardon, Kalogrides, and Ho developed precision-adjusted random effects models to estimate aggregate-level linking error, for populations and subpopulations, for averages and progress over time. We are grateful to past editor Dan McCaffrey for selecting our paper as the focal article for a set of commentaries from our colleagues Daniel Bolt, Mark Davison, Alina von Davier, Tim Moses, and Neil Dorans. These commentaries reinforce important cautions and identify promising directions for future research. In this rejoinder, we clarify aspects of our originally proposed method. (1) Validation methods provide evidence of benefits and risks that different experts may weigh differently for different purposes. (2) Our proposed method differs from “standard mapping” procedures using the National Assessment of Educational Progress not only by using a linear (vs. equipercentile) link but also by targeting direct validity evidence about counterfactual aggregate scores. (3) Multilevel approaches that assume common score scales across states are indeed a promising next step for validation, and we hope that states enable researchers to use more of their common-core-era consortium test data for this purpose. Finally, we apply our linking method to an extended panel of data from 2009 to 2017 to show that linking recovery has remained stable.","PeriodicalId":48001,"journal":{"name":"Journal of Educational and Behavioral Statistics","volume":"46 1","pages":"209 - 218"},"PeriodicalIF":1.9000,"publicationDate":"2021-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Validation Methods for Aggregate-Level Test Scale Linking: A Rejoinder\",\"authors\":\"Andrew D. Ho, Sean F. Reardon, Demetra Kalogrides\",\"doi\":\"10.3102/1076998621994540\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this issue, Reardon, Kalogrides, and Ho developed precision-adjusted random effects models to estimate aggregate-level linking error, for populations and subpopulations, for averages and progress over time. We are grateful to past editor Dan McCaffrey for selecting our paper as the focal article for a set of commentaries from our colleagues Daniel Bolt, Mark Davison, Alina von Davier, Tim Moses, and Neil Dorans. These commentaries reinforce important cautions and identify promising directions for future research. In this rejoinder, we clarify aspects of our originally proposed method. (1) Validation methods provide evidence of benefits and risks that different experts may weigh differently for different purposes. (2) Our proposed method differs from “standard mapping” procedures using the National Assessment of Educational Progress not only by using a linear (vs. equipercentile) link but also by targeting direct validity evidence about counterfactual aggregate scores. (3) Multilevel approaches that assume common score scales across states are indeed a promising next step for validation, and we hope that states enable researchers to use more of their common-core-era consortium test data for this purpose. 
Finally, we apply our linking method to an extended panel of data from 2009 to 2017 to show that linking recovery has remained stable.\",\"PeriodicalId\":48001,\"journal\":{\"name\":\"Journal of Educational and Behavioral Statistics\",\"volume\":\"46 1\",\"pages\":\"209 - 218\"},\"PeriodicalIF\":1.9000,\"publicationDate\":\"2021-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Educational and Behavioral Statistics\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.3102/1076998621994540\",\"RegionNum\":3,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"EDUCATION & EDUCATIONAL RESEARCH\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Educational and Behavioral Statistics","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.3102/1076998621994540","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 1

Abstract

In this issue, Reardon, Kalogrides, and Ho developed precision-adjusted random effects models to estimate aggregate-level linking error, for populations and subpopulations, for averages and progress over time. We are grateful to past editor Dan McCaffrey for selecting our paper as the focal article for a set of commentaries from our colleagues Daniel Bolt, Mark Davison, Alina von Davier, Tim Moses, and Neil Dorans. These commentaries reinforce important cautions and identify promising directions for future research. In this rejoinder, we clarify aspects of our originally proposed method. (1) Validation methods provide evidence of benefits and risks that different experts may weigh differently for different purposes. (2) Our proposed method differs from "standard mapping" procedures using the National Assessment of Educational Progress not only by using a linear (vs. equipercentile) link but also by targeting direct validity evidence about counterfactual aggregate scores. (3) Multilevel approaches that assume common score scales across states are indeed a promising next step for validation, and we hope that states enable researchers to use more of their common-core-era consortium test data for this purpose. Finally, we apply our linking method to an extended panel of data from 2009 to 2017 to show that linking recovery has remained stable.
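For readers unfamiliar with the terminology in point (2), the sketch below illustrates what a linear link between two score scales does in its simplest form: it rescales scores so that their mean and standard deviation match those of a reference distribution (an equipercentile link, by contrast, matches the full distributions percentile by percentile). This is only an illustrative sketch under assumed inputs; the function name, the toy numbers, and the reduction to a single mean/SD transformation are hypothetical, and the authors' actual procedure relies on precision-adjusted random effects models rather than this direct formula.

```python
# Minimal sketch of a linear (mean/SD) link between two test score scales.
# This is NOT the authors' precision-adjusted random-effects procedure;
# all names and data here are hypothetical and for illustration only.
import numpy as np

def linear_link(state_scores: np.ndarray, ref_mean: float, ref_sd: float) -> np.ndarray:
    """Map scores from a state scale onto a reference scale by matching
    the mean and standard deviation of the reference distribution."""
    state_mean = state_scores.mean()
    state_sd = state_scores.std(ddof=1)
    # Linear link y = A * x + B, with A and B chosen so the linked scores
    # reproduce the reference mean and SD.
    a = ref_sd / state_sd
    b = ref_mean - a * state_mean
    return a * state_scores + b

# Hypothetical usage: district-level means on a state scale, linked to a
# reference scale with mean 250 and SD 35.
district_means = np.array([212.0, 231.5, 198.4, 245.0, 220.3])
linked = linear_link(district_means, ref_mean=250.0, ref_sd=35.0)
print(np.round(linked, 1))
```

Because a linear link is determined by only two moments, it is less sensitive to sparse tails than an equipercentile link, which is one reason the distinction matters when validating aggregate-level (rather than student-level) linkages.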
Source Journal
CiteScore: 4.40
Self-citation rate: 4.20%
Articles published: 21
Journal Description: Journal of Educational and Behavioral Statistics, sponsored jointly by the American Educational Research Association and the American Statistical Association, publishes articles that are original and provide methods that are useful to those studying problems and issues in educational or behavioral research. Typical papers introduce new methods of analysis, provide properties of these methods, and give an example of use in education or behavioral research. Critical reviews of current practice, tutorial presentations of less well known methods, and novel applications of already-known methods are also of interest. Papers discussing statistical techniques without specific educational or behavioral interest, or focusing on substantive results without developing new statistical methods or models or making novel use of existing methods, have lower priority. Simulation studies, either to demonstrate properties of an existing method or to compare several existing methods (without providing a new method), also have low priority.