{"title":"Commentary: Modernizing Educational Assessment Training for Changing Job Markets","authors":"André A. Rupp","doi":"10.1111/emip.12629","DOIUrl":"10.1111/emip.12629","url":null,"abstract":"","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"43 3","pages":"33-38"},"PeriodicalIF":2.7,"publicationDate":"2024-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142211659","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Foundational Competencies in Educational Measurement: A Rejoinder
Andrew D. Ho, Terry A. Ackerman, Deborah L. Bandalos, Derek C. Briggs, Howard T. Everson, Susan M. Lottridge, Matthew J. Madison, Sandip Sinharay, Michael C. Rodriguez, Michael Russell, Alina A. von Davier, Stefanie A. Wind
What are foundational competencies in educational measurement? We published a framework for these foundational competencies in this journal (Ackerman et al. 2024) and were grateful to receive eight commentaries raising a number of important questions about the framework and its implications. We identified five cross-cutting recommendations among the eight commentaries relating to (1) our process and purpose, (2) Artificial Intelligence, (3) ethical competencies, (4) qualitative, critical, and culturally responsive commentaries, and (5) intersecting professions in, for example, classroom assessment and content development. In this rejoinder, we respond to these five recommendations and to each of the eight commentaries in turn. We hope that discussion and consensus about foundational competencies in educational measurement continue to advance in our journals and our field.
Educational Measurement: Issues and Practice, 43(3), 56-63. DOI: 10.1111/emip.12623. Published online 2024-08-12.
Commentary: Perspectives of Early Career Professionals on Enhancing Cultural Responsiveness in Educational Measurement
Dustin S. J. Van Orman, Janine A. Jackson, Thao T. Vo, Darius D. Taylor
The “Foundational Competencies in Educational Measurement” framework aims to shape the field's future. However, the absence of emerging scholars and graduate students in the task force highlights a gap in representing those most familiar with the current educational landscape. As early career scholars, we offer perspectives to enhance this framework by focusing on making educational measurement more inclusive, collaborative, and culturally responsive. Drawing on our diverse backgrounds and experiences, we propose expanding the framework to empower measurement professionals, diversify measurement practices, and integrate ethical considerations. We also advocate for a new visual representation of the framework as a budding plant, symbolizing the organic and evolving nature of foundational skills in educational measurement. This commentary aims to refine the foundational competencies to better prepare future professionals for meaningful, equitable educational contributions.
Educational Measurement: Issues and Practice, 43(3), 27-32. DOI: 10.1111/emip.12628. Published online 2024-08-12. Open access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1111/emip.12628
Improving Instructional Decision-Making Using Diagnostic Classification Models
W. Jake Thompson, Amy K. Clark
In recent years, educators, administrators, policymakers, and measurement experts have called for assessments that support educators in making better instructional decisions. One promising approach to measurement to support instructional decision-making is diagnostic classification models (DCMs). DCMs are flexible psychometric models that facilitate fine-grained reporting on skills that students have mastered. In this article, we describe how DCMs can be leveraged to support better decision-making. We first provide a high-level overview of DCMs. We then describe different methods for reporting results from DCM-based assessments that support decision-making for different stakeholder groups. We close with a discussion of considerations for implementing DCMs in an operational setting, including how they can inform decision-making at state and local levels, and share future directions for research.
Educational Measurement: Issues and Practice. DOI: 10.1111/emip.12619. Published online 2024-06-25.