Commentary: How Research and Testing Companies can Support Early-Career Measurement Professionals
Kamal Middlebrook, Laura S. Hamilton, Michael E. Walker
Educational Measurement: Issues and Practice, 43(3), 45–48. Published 2024-08-16. DOI: 10.1111/emip.12631
Commentary: Modernizing Educational Assessment Training for Changing Job Markets
André A. Rupp
Educational Measurement: Issues and Practice, 43(3), 33–38. Published 2024-08-13. DOI: 10.1111/emip.12629
Foundational Competencies in Educational Measurement: A Rejoinder
Andrew D. Ho, Terry A. Ackerman, Deborah L. Bandalos, Derek C. Briggs, Howard T. Everson, Susan M. Lottridge, Matthew J. Madison, Sandip Sinharay, Michael C. Rodriguez, Michael Russell, Alina A. von Davier, Stefanie A. Wind
Educational Measurement: Issues and Practice, 43(3), 56–63. Published 2024-08-12. DOI: 10.1111/emip.12623

What are foundational competencies in educational measurement? We published a framework for these foundational competencies in this journal (Ackerman et al., 2024) and were grateful to receive eight commentaries raising a number of important questions about the framework and its implications. We identified five cross-cutting recommendations among the eight commentaries relating to (1) our process and purpose, (2) artificial intelligence, (3) ethical competencies, (4) qualitative, critical, and culturally responsive commentaries, and (5) intersecting professions in, for example, classroom assessment and content development. In this rejoinder, we respond to these five recommendations and to each of the eight commentaries in turn. We hope that discussion and consensus about foundational competencies in educational measurement continue to advance in our journals and our field.
Commentary: Perspectives of Early Career Professionals on Enhancing Cultural Responsiveness in Educational Measurement
Dustin S. J. Van Orman, Janine A. Jackson, Thao T. Vo, Darius D. Taylor
Educational Measurement: Issues and Practice, 43(3), 27–32. Published 2024-08-12. DOI: 10.1111/emip.12628. Open access: https://onlinelibrary.wiley.com/doi/epdf/10.1111/emip.12628

The "Foundational Competencies in Educational Measurement" framework aims to shape the field's future. However, the absence of emerging scholars and graduate students from the task force highlights a gap in representing those most familiar with the current educational landscape. As early career scholars, we offer perspectives to enhance this framework by focusing on making educational measurement more inclusive, collaborative, and culturally responsive. Drawing on our diverse backgrounds and experiences, we propose expanding the framework to empower measurement professionals, diversify measurement practices, and integrate ethical considerations. We also advocate for a new visual representation of the framework as a budding plant, symbolizing the organic and evolving nature of foundational skills in educational measurement. This commentary aims to refine the foundational competencies to better prepare future professionals for meaningful, equitable educational contributions.
Improving Instructional Decision-Making Using Diagnostic Classification Models
W. Jake Thompson, Amy K. Clark
Educational Measurement: Issues and Practice, 43(4), 146–156. Published 2024-06-25. DOI: 10.1111/emip.12619

In recent years, educators, administrators, policymakers, and measurement experts have called for assessments that support educators in making better instructional decisions. One promising approach to measurement to support instructional decision-making is diagnostic classification models (DCMs). DCMs are flexible psychometric models that facilitate fine-grained reporting on skills that students have mastered. In this article, we describe how DCMs can be leveraged to support better decision-making. We first provide a high-level overview of DCMs. We then describe different methods for reporting results from DCM-based assessments that support decision-making for different stakeholder groups. We close with a discussion of considerations for implementing DCMs in an operational setting, including how they can inform decision-making at state and local levels, and share future directions for research.
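To make the idea of fine-grained mastery reporting concrete, the sketch below implements the DINA (deterministic inputs, noisy "and" gate) model, one common DCM. This is an illustration only, not the specific model used in the article, and the Q-matrix entry, slip, and guess values are hypothetical.

```python
import numpy as np

def dina_prob(alpha, q_row, slip, guess):
    """P(correct response) under the DINA model for a student with
    binary skill profile `alpha` on an item whose required skills are
    flagged in Q-matrix row `q_row` (hypothetical parameter values)."""
    # eta = 1 only if the student has mastered ALL skills the item requires
    eta = int(np.all(alpha[q_row == 1] == 1))
    return (1 - slip) ** eta * guess ** (1 - eta)

q_row = np.array([1, 1, 0])      # item requires skills 1 and 2
master = np.array([1, 1, 0])     # student has mastered skills 1 and 2
nonmaster = np.array([1, 0, 0])  # student is missing skill 2

p_master = dina_prob(master, q_row, slip=0.1, guess=0.2)
p_nonmaster = dina_prob(nonmaster, q_row, slip=0.1, guess=0.2)
print(p_master, p_nonmaster)  # 0.9 0.2
```

Because the model classifies students by skill profile rather than placing them on a single scale, score reports can say which specific skills a student has yet to master, which is what supports the instructional decisions the article describes.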
Item Response Theory Models for Polytomous Multidimensional Forced-Choice Items to Measure Construct Differentiation
Xuelan Qiu, Jimmy de la Torre, You-Gan Wang, Jinran Wu
Educational Measurement: Issues and Practice, 43(4), 157–168. Published 2024-06-10. DOI: 10.1111/emip.12621

Multidimensional forced-choice (MFC) items have been found useful for reducing response biases in personality assessments. However, conventional scoring methods for MFC items result in ipsative data, hindering wider application of the MFC format. In the last decade, a number of item response theory (IRT) models have been developed, the majority of which are for MFC items with binary responses. However, MFC items with polytomous responses are more informative and have many applications. This paper develops a polytomous Rasch ipsative model (pRIM) that can deal with ipsative data and yield estimates that measure construct differentiation, a latent trait that describes the degree to which the personality constructs (e.g., interests) distinguish between each other. The pRIM and its simpler form are applied to a career interest assessment containing four-category MFC items, and the measures of interest differentiation are used for both intra- and interpersonal comparisons. Simulations are conducted to examine parameter recovery under various conditions. The results show that the parameters of the pRIM can be well recovered, particularly when a complete linking design and a large sample are used. The implications and applications of the pRIM for personality assessment using MFC items are discussed.
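The ipsativity problem the abstract raises can be seen in a few lines: when respondents rank the statements within each forced-choice block, every respondent's total score is identical by construction, so conventional scores carry only within-person information. The sketch below assumes rank-based block scoring with made-up block counts and sizes; it is not the pRIM itself.

```python
import numpy as np

# Hypothetical MFC instrument: each respondent ranks the 3 statements
# in each of 5 blocks, and the ranks (1..3) are used as scores.
rng = np.random.default_rng(0)
n_persons, n_blocks, block_size = 4, 5, 3

ranks = np.stack([
    np.stack([rng.permutation(block_size) + 1 for _ in range(n_blocks)])
    for _ in range(n_persons)
])  # shape: (persons, blocks, statements)

# Each block contributes 1+2+3 = 6 regardless of how it was ranked,
# so every respondent's total is n_blocks * 6 = 30: the data are ipsative.
totals = ranks.sum(axis=(1, 2))
print(totals)  # [30 30 30 30]
```

Because totals are constant, between-person comparisons on the raw scores are meaningless, which is why models such as the pRIM that handle ipsative data directly are needed.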