Fairness in Educational Assessment and the Next Edition of the Standards: Concluding Commentary
J. Herman, J. Martínez, A. Bailey
Pub Date: 2023-04-03 | DOI: 10.1080/10627197.2023.2215980
ABSTRACT This special issue of Educational Assessment seeks to encourage reflection and discussion around the different assumptions and conceptualizations of fairness in assessment and their potential links to and implications for the next edition of the Standards for Educational and Psychological Testing. In this final commentary, the special issue editors summarize the major points advanced by the three contributing authors and consider the variety of conceptual, methodological, and practical challenges and questions raised. We discuss a range of remaining issues requiring additional theorizing and empirical research to further illuminate and bring the ideas of the contributors to fruition. Finally, we highlight areas with direct implications to be considered in the development of the next edition of the Standards.
Introduction to the Special Issue: Fairness in Educational Assessment and the Next Edition of the Standards
J. Herman, A. Bailey, J. Martínez
Pub Date: 2023-04-03 | DOI: 10.1080/10627197.2023.2215979
ABSTRACT This introduction provides context for Educational Assessment's special issue, "Fairness in Educational Assessment and the Next Edition of the Standards." The article introduces the topic of fairness by citing the prior special issue on which the current issue builds, summarizes the current fairness chapter of the Standards for Educational and Psychological Testing (2014), and provides an overview of the issue. The issue includes focal articles by Dr. Jennifer Randall and Dr. Randy Bennett and a synthesis discussion by Dr. Guillermo Solano-Flores. The two focal authors then respond to Dr. Solano-Flores, and the special issue editors close the issue with a concluding commentary.
Let's Agree to (Mostly) Agree: A Response to Solano-Flores
R. Bennett
Pub Date: 2023-04-03 | DOI: 10.1080/10627197.2023.2215978
ABSTRACT "Toward a Theory of Socioculturally Responsive Assessment" assembled design principles from multiple literatures and wove them into a working definition and a network of empirically testable propositions. The intention was to offer a coherent theoretical framework within which to understand why and how particular assessment designs might work, what actions testing programs should consider, how they might move forward with those actions, and how to evaluate the impact. Dr. Solano-Flores offers many comments on these ideas, with which I mostly agree. In this response, I detail those agreements, as well as some points of departure. I close with some implications for revising the Standards.
Toward a Theory of Socioculturally Responsive Assessment
R. Bennett
Pub Date: 2023-04-03 | DOI: 10.1080/10627197.2023.2202312
ABSTRACT In the United States, opposition to traditional standardized tests is widespread, particularly obvious in the admissions context but also evident in elementary and secondary education. This opposition is fueled in significant part by the perception that tests perpetuate social injustice through their content, design, and use. To survive, as well as contribute positively, the measurement field must rethink assessment, including how to make it more socioculturally responsive. This paper offers a rationale for that rethinking and then employs provisional design principles drawn from various literatures to formulate a working definition and the beginnings of a theory. In the closing section, a path toward implementation is suggested.
How Serious are We About Fairness in Testing and How Far are We Willing to Go? A Response to Randall and Bennett with Reflections About the Standards for Educational and Psychological Testing
Guillermo Solano-Flores
Pub Date: 2023-04-03 | DOI: 10.1080/10627197.2023.2226388
ABSTRACT Jennifer Randall's paper on justice-oriented assessment and Randy Bennett's paper on socioculturally responsive assessment address fairness in the testing of racially, culturally, and linguistically diverse student populations by providing principles and recommendations for improved assessment practice. I warn about the perils of assuming that principles and recommendations suffice to promote fair testing in the absence of serious changes in the entire process of assessment. I liken the limitations of this over-reliance on principles and recommendations to the limitations of the fairness chapter of the Standards for Educational and Psychological Testing, whose wording portrays actions to address fairness in testing as optional. A transformative agenda on assessment practice needs to be based on a systemic perspective that involves all components and stages in the assessment process, and it needs to aim to produce a paradigm shift that establishes more rigorous expectations about what counts as fairness in assessment.
Response to Solano-Flores: How Serious are We About Fairness in Testing and How Far are We Willing to Go?
Jennifer Randall
Pub Date: 2023-04-03 | DOI: 10.1080/10627197.2023.2212900
ABSTRACT "It Ain't Near 'Bout Fair: Re-envisioning the Bias and Sensitivity Review Process from a Justice-Oriented Antiracist Perspective" was intended to facilitate conversation in the field about the bias and sensitivity review process, specifically. I argue that our current approaches rely far too heavily on fear-based notions to the exclusion of justice-based aims. As Dr. Solano-Flores points out, however, this conversation must be considered in concert with a larger conversation around principles of equity and justice in the assessment design and development process in order for the necessary transformational change to occur. In this paper, I consider Dr. Solano-Flores' ideas and suggest a path forward with these ideas in mind.
The Effects of Providing Students with Revision Opportunities in Alternate Assessments
O. Bulut, H. Bulut, D. Cormier, Munevver Ilgun Dibek, Merve Sahin Kursad
Pub Date: 2022-10-25 | DOI: 10.1080/10627197.2022.2138322
ABSTRACT Some statewide testing programs allow students to receive corrective feedback and revise their answers during testing. Despite its pedagogical benefits, the effects of providing revision opportunities remain unknown in the context of alternate assessments. Therefore, this study examined student data from a large-scale alternate assessment that allows students to make multiple attempts until they find the correct answer to multiple-choice items. The students receive partial credit based on the number of attempts made. The effects of the multiple-attempt approach on both test characteristics and student performance were investigated. The results indicated that, despite making most items on the assessment relatively easier, the availability of partial credit improved the strength of the items in distinguishing low-achieving and high-achieving students while maintaining high internal consistency among the test items. Although the students were able to increase their scores due to the inclusion of partial credit based on the number of attempts, the relative positions of the students remained nearly the same.
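The abstract does not specify the exact scoring rule, but a common way to award partial credit under an answer-until-correct format is to scale credit by the number of attempts used. The linear rule below is an illustrative sketch, not the rule used in the study:

```python
def partial_credit(attempts_used: int, max_attempts: int = 4) -> float:
    """Illustrative answer-until-correct scoring rule: full credit for a
    first-attempt success, linearly less credit for each extra attempt,
    and zero if the key is never selected within max_attempts."""
    if attempts_used < 1 or attempts_used > max_attempts:
        return 0.0
    return (max_attempts - attempts_used + 1) / max_attempts

# A four-option multiple-choice item: correct on the first try earns 1.0,
# on the second 0.75, on the third 0.5, and on the fourth 0.25.
scores = [partial_credit(a) for a in (1, 2, 3, 4)]
```

Under a rule like this, every examinee who eventually finds the key earns something, which is consistent with the abstract's finding that items become relatively easier while still separating low- and high-achieving students.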
The Nature of Cognitive Test Anxiety: An Investigation of the Factor Structure of the Cognitive Test Anxiety Scale
Lilla Németh, László Bernáth
Pub Date: 2022-10-06 | DOI: 10.1080/10627197.2022.2130747
ABSTRACT The Cognitive Test Anxiety Scale (CTAS) is a unidimensional scale designed to measure the cognitive aspect of test anxiety. The instrument has been adapted in several countries, and convincing psychometric properties have been found; however, uncertainties remain regarding its factor structure. Therefore, the aim of this study is twofold: to revise the instrument's factor structure and to investigate the state or trait nature of the construct. The results of exploratory and confirmatory factor analyses suggest that the CTAS includes three dimensions: general worry, freezing up, and fear of failure. The reliability measures of the subscales showed appropriate values, and validity evidence supported the multidimensionality of the CTAS. Finally, the state or trait nature of the construct was studied by investigating how taking an exam immediately before the scale's administration affects CTAS scores. Results imply that cognitive test anxiety measured by the CTAS should be considered a trait.
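The kind of factor-structure check the abstract describes can be sketched with off-the-shelf tools. The example below uses simulated stand-in data (the real CTAS items and sample are not reproduced here) to show how a three-factor solution with simple structure would be recovered:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulated stand-in data: 300 respondents x 9 items, generated so that
# each of three latent factors drives a distinct block of three items.
rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 3))                # three hypothesized factors
loadings = np.kron(np.eye(3), np.ones((1, 3)))    # block-diagonal loading pattern
items = latent @ loadings + rng.normal(scale=0.5, size=(300, 9))

# Fit a three-factor model with varimax rotation; fa.components_ holds the
# rotated loadings. A simple-structure pattern (each item loading mainly on
# one factor) is the kind of evidence a multidimensional model rests on.
fa = FactorAnalysis(n_components=3, rotation="varimax").fit(items)
```

In the study itself, confirmatory factor analysis would additionally test the hypothesized three-dimensional structure against fit indices; this sketch covers only the exploratory step.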
Assessment during COVID-19: Students and Teachers in Limbo When the Classroom Disappeared
L. Sandvik, B. Svendsen, Alex Strømme, Kari Smith, Oda Aasmundstad Sommervold, Stine Aarønes Angvik
Pub Date: 2022-10-03 | DOI: 10.1080/10627197.2022.2122953
ABSTRACT The lockdowns that began during the spring of 2020 changed the conditions for teaching and assessment across the globe. In Norway, schools were closed, and all school activities took place online. Moreover, all final exams were canceled, and all student grading was based on final grading by the individual teacher. Because of this, teachers' assessment skills became more important. This study examines students' and teachers' experiences of assessment during the lockdown period. The findings revealed that students received little support from their teachers in the learning process; they worked alone and felt insecure about assessment. Teacher collaboration around assessment seemed sporadic, and assessment routines were weak. The study raises concerns about equity in education when teachers have problems implementing assessment practices that support students' learning.
To What Degree Does Rapid Guessing Distort Aggregated Test Scores? A Meta-analytic Investigation
Joseph A. Rios, Jiayi Deng, Samuel D. Ihlenfeldt
Pub Date: 2022-08-25 | DOI: 10.1080/10627197.2022.2110465
ABSTRACT The present meta-analysis sought to quantify the average degree of aggregated test score distortion due to rapid guessing (RG). Included studies group-administered a low-stakes cognitive assessment, identified RG via response times, and reported the rate of examinees engaging in RG, the percentage of RG responses observed, and/or the degree of score distortion in aggregated test scores due to RG. The final sample consisted of 25 studies and 39 independent samples comprising 443,264 unique examinees. Results demonstrated that an average of 28.3% of examinees engaged in RG (21% were deemed to engage in RG on a nonnegligible number of items) and 6.89% of item responses were classified as rapid guesses. Across 100 effect sizes, RG was found to negatively distort aggregated test scores by an average of 0.13 standard deviations; however, this relationship was moderated by both test content area and filtering procedure.
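The studies in the meta-analysis flag rapid guessing from response times; one widely used family of methods sets a normative per-item threshold (for example, a fixed fraction of the item's typical response time). The sketch below illustrates that idea; the specific 10%-of-median cutoff is an assumption for illustration, not the rule used by any particular included study:

```python
from statistics import median

def flag_rapid_guesses(response_times, threshold_fraction=0.10):
    """Flag responses as rapid guesses with a simple normative threshold:
    any response faster than threshold_fraction of the item's median
    response time is treated as a rapid guess.

    response_times: dict mapping item id -> list of times in seconds.
    Returns a dict mapping item id -> list of booleans (True = flagged).
    """
    flags = {}
    for item, times in response_times.items():
        cutoff = threshold_fraction * median(times)
        flags[item] = [t < cutoff for t in times]
    return flags

rt = {"item1": [0.8, 12.0, 15.5, 9.7, 1.0, 14.2]}
flags = flag_rapid_guesses(rt)
# Median is 10.85 s, so the cutoff is 1.085 s: the first and fifth
# responses are flagged as rapid guesses.
```

Once flagged, responses can be filtered out or rescored before aggregation, which is exactly the "filtering procedure" the abstract identifies as a moderator of how much RG distorts aggregated scores.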