Teacher Assessment Literacy: Implications for Diagnostic Assessment Systems
Authors: Amy K. Clark, Brooke L. Nash, Meagan Karvonen
Journal: Applied Measurement in Education, 35(1), 17–32 (published 2022-01-02)
DOI: https://doi.org/10.1080/08957347.2022.2034823
Citations: 4
Abstract
Assessments scored with diagnostic models are increasingly popular because they provide fine-grained information about student achievement. Because diagnostic assessments differ in how they are scored and how their results are used, the information teachers need in order to interpret and use results may differ from the concepts traditionally included in assessment literacy training for assessments that produce a raw or scale score. In this study, we connect the assessment literacy and score reporting literatures to understand teachers’ assessment literacy in a diagnostic assessment context, as demonstrated by focus group and survey responses. Results summarize teachers’ descriptions of fundamental diagnostic assessment concepts, their understanding of the diagnostic assessment and the results it produces, and how diagnostic assessment results influence their instructional decision-making. Teachers understood how to use results and were comfortable using the term mastery when interpreting score report contents and planning subsequent instruction. However, teachers were unsure how mastery was calculated, and some misinterpreted mastery as representing a percent correct rather than a probability value. We share implications for others implementing large-scale diagnostic assessments or designing score reports for such systems.
About the journal:
Because interaction between the domains of research and application is critical to the evaluation and improvement of new educational measurement practices, Applied Measurement in Education's prime objective is to improve communication between academicians and practitioners. To help bridge the gap between theory and practice, articles in this journal describe original research studies, innovative strategies for solving educational measurement problems, and integrative reviews of current approaches to contemporary measurement issues. Peer review policy: all review papers in this journal have undergone editorial screening and peer review.