{"title":"快速猜测在多大程度上扭曲了综合测试成绩?元分析研究","authors":"Joseph A. Rios, Jiayi Deng, Samuel D. Ihlenfeldt","doi":"10.1080/10627197.2022.2110465","DOIUrl":null,"url":null,"abstract":"ABSTRACT The present meta-analysis sought to quantify the average degree of aggregated test score distortion due to rapid guessing (RG). Included studies group-administered a low-stakes cognitive assessment, identified RG via response times, and reported the rate of examinees engaging in RG, the percentage of RG responses observed, and/or the degree of score distortion in aggregated test scores due to RG. The final sample consisted of 25 studies and 39 independent samples comprised of 443,264 unique examinees. Results demonstrated that an average of 28.3% of examinees engaged in RG (21% were deemed to engage in RG on a nonnegligible number of items) and 6.89% of item responses were classified as rapid guesses. Across 100 effect sizes, RG was found to negatively distort aggregated test scores by an average of 0.13 standard deviations; however, this relationship was moderated by both test content area and filtering procedure.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"27 1","pages":"356 - 373"},"PeriodicalIF":2.1000,"publicationDate":"2022-08-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"To What Degree Does Rapid Guessing Distort Aggregated Test Scores? A Meta-analytic Investigation\",\"authors\":\"Joseph A. Rios, Jiayi Deng, Samuel D. Ihlenfeldt\",\"doi\":\"10.1080/10627197.2022.2110465\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"ABSTRACT The present meta-analysis sought to quantify the average degree of aggregated test score distortion due to rapid guessing (RG). Included studies group-administered a low-stakes cognitive assessment, identified RG via response times, and reported the rate of examinees engaging in RG, the percentage of RG responses observed, and/or the degree of score distortion in aggregated test scores due to RG. The final sample consisted of 25 studies and 39 independent samples comprised of 443,264 unique examinees. Results demonstrated that an average of 28.3% of examinees engaged in RG (21% were deemed to engage in RG on a nonnegligible number of items) and 6.89% of item responses were classified as rapid guesses. 
Across 100 effect sizes, RG was found to negatively distort aggregated test scores by an average of 0.13 standard deviations; however, this relationship was moderated by both test content area and filtering procedure.\",\"PeriodicalId\":46209,\"journal\":{\"name\":\"Educational Assessment\",\"volume\":\"27 1\",\"pages\":\"356 - 373\"},\"PeriodicalIF\":2.1000,\"publicationDate\":\"2022-08-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Educational Assessment\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/10627197.2022.2110465\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"EDUCATION & EDUCATIONAL RESEARCH\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Educational Assessment","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/10627197.2022.2110465","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
To What Degree Does Rapid Guessing Distort Aggregated Test Scores? A Meta-analytic Investigation
ABSTRACT: The present meta-analysis sought to quantify the average degree of aggregated test score distortion due to rapid guessing (RG). Included studies group-administered a low-stakes cognitive assessment, identified RG via response times, and reported the rate of examinees engaging in RG, the percentage of RG responses observed, and/or the degree of score distortion in aggregated test scores due to RG. The final sample consisted of 25 studies and 39 independent samples comprising 443,264 unique examinees. Results demonstrated that an average of 28.3% of examinees engaged in RG (21% were deemed to engage in RG on a nonnegligible number of items) and 6.89% of item responses were classified as rapid guesses. Across 100 effect sizes, RG was found to negatively distort aggregated test scores by an average of 0.13 standard deviations; however, this relationship was moderated by both test content area and filtering procedure.
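To make the quantities in the abstract concrete (the percentage of RG responses, the percentage of examinees flagged, and score distortion expressed in standard-deviation units), the Python sketch below simulates one plausible workflow. It is not the authors' code or data: it assumes a single fixed response-time threshold (rt_threshold) and a hypothetical 10% per-examinee cutoff (rg_share_cutoff), whereas the studies included in the meta-analysis used a variety of threshold-setting and filtering procedures, and the simulated values are arbitrary.

```python
# Minimal sketch (not the authors' method): flag rapid guesses (RG) from
# response times, summarize RG prevalence, and estimate aggregate score
# distortion as a standardized mean difference. All data are simulated.

import numpy as np

rng = np.random.default_rng(42)

n_examinees, n_items = 500, 40
rt_threshold = 2.0        # seconds; hypothetical per-item RG threshold
rg_share_cutoff = 0.10    # hypothetical cutoff: >10% RG responses = nonnegligible RG

# Simulated response times (log-normal) and dichotomous item scores.
resp_times = rng.lognormal(mean=2.0, sigma=0.8, size=(n_examinees, n_items))
scores = rng.binomial(1, 0.6, size=(n_examinees, n_items)).astype(float)

# Classify rapid guesses and degrade those responses toward chance (25% correct),
# mimicking the distortion RG introduces on a 4-option multiple-choice test.
is_rg = resp_times < rt_threshold
scores[is_rg] = rng.binomial(1, 0.25, size=is_rg.sum())

# Descriptives of the kind reported by the included studies.
pct_rg_responses = is_rg.mean() * 100
rg_share_per_examinee = is_rg.mean(axis=1)
pct_rg_examinees = (rg_share_per_examinee > 0).mean() * 100
pct_nonnegligible = (rg_share_per_examinee > rg_share_cutoff).mean() * 100

# Aggregate score distortion: standardized difference between the unfiltered
# mean total score and the mean after removing examinees flagged for RG.
total = scores.sum(axis=1)
filtered = total[rg_share_per_examinee <= rg_share_cutoff]
distortion_sd = (total.mean() - filtered.mean()) / total.std(ddof=1)

print(f"% RG responses: {pct_rg_responses:.2f}")
print(f"% examinees with any RG: {pct_rg_examinees:.1f}")
print(f"% examinees with nonnegligible RG: {pct_nonnegligible:.1f}")
print(f"Score distortion (SD units): {distortion_sd:.3f}")
```

In this sketch the distortion statistic is the standardized difference between unfiltered and examinee-filtered mean scores; studies in the meta-analysis also used other procedures, such as rescoring rapid guesses as missing or incorrect, which is one reason filtering procedure emerged as a moderator of the effect.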
About the Journal:
Educational Assessment publishes original research and scholarship on the assessment of individuals, groups, and programs in educational settings. It includes theory, methodological approaches and empirical research in the appraisal of the learning and achievement of students and teachers, young children and adults, and novices and experts. The journal reports on current large-scale testing practices, discusses alternative approaches, presents scholarship on classroom assessment practices and includes assessment topics debated at the national level. It welcomes both conceptual and empirical pieces and encourages articles that provide a strong bridge between theory and/or empirical research and the implications for educational policy and/or practice.