Validating Rubric Scoring Processes: An Application of an Item Response Tree Model
Aaron J. Myers, Allison J. Ames, B. Leventhal, Madison A. Holzman
Applied Measurement in Education, 33, 293–308 (Journal Article, published online 2020-07-20). DOI: https://doi.org/10.1080/08957347.2020.1789143
ABSTRACT When rating performance assessments, raters may assign different scores to the same performance when rubric application does not align with the intended application of the scoring criteria. Given that performance assessment score interpretation assumes raters apply rubrics as rubric developers intended, misalignment between raters’ scoring processes and the intended scoring processes may lead to invalid inferences from these scores. In an effort to standardize raters’ scoring processes, an alternative scoring method was used. With this method, rubric developers’ intended scoring processes are made explicit by requiring raters to respond to a series of selected-response statements resembling a decision tree. To determine whether raters scored essays as intended under a traditional rubric and under the alternative scoring method, an IRT model with a tree-like structure (IRTree) was specified to depict the intended scoring processes and fit to data from each scoring method. Results suggest raters using the alternative method may be better able to rate as intended, making it a viable alternative to traditional rubric scoring. Implications of the IRTree model are discussed.
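The IRTree approach referenced in the abstract works by recoding each polytomous rubric score as a sequence of binary node decisions defined by a tree, then modeling each node as a binary pseudo-item (commonly with a Rasch or 2PL model). The sketch below is a minimal, hypothetical illustration of that recoding for a linear tree over a 0-3 rubric; the tree structure, the names score_to_nodes and node_probability, and the Rasch-type node model are illustrative assumptions, not the authors' specification.

```python
import math

# Hypothetical linear response tree for a 0-3 rubric. Each observed score
# maps to binary outcomes at three internal nodes; None marks a node that
# is never reached, so its pseudo-item is treated as missing.
LINEAR_TREE = {
    0: (0, None, None),  # failed node 1
    1: (1, 0, None),     # passed node 1, failed node 2
    2: (1, 1, 0),        # passed nodes 1 and 2, failed node 3
    3: (1, 1, 1),        # passed all three nodes
}

def score_to_nodes(score):
    """Recode an observed rubric score into binary node pseudo-responses."""
    return LINEAR_TREE[score]

def node_probability(theta, b):
    """Rasch-type probability of 'passing' a node: logistic(theta - b)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Example: expand four essay scores into pseudo-item data that a standard
# binary IRT routine could then fit, one set of parameters per node.
scores = [3, 1, 2, 0]
print([score_to_nodes(s) for s in scores])
# [(1, 1, 1), (1, 0, None), (1, 1, 0), (0, None, None)]
```

Under a recoding like this, comparing how well the tree-implied node structure fits the data from each scoring method is what allows the question of whether raters followed the intended scoring process to be tested.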
Journal introduction:
Because interaction between the domains of research and application is critical to the evaluation and improvement of new educational measurement practices, Applied Measurement in Education's prime objective is to improve communication between academicians and practitioners. To help bridge the gap between theory and practice, articles in this journal describe original research studies, innovative strategies for solving educational measurement problems, and integrative reviews of current approaches to contemporary measurement issues. Peer Review Policy: All review papers in this journal have undergone editorial screening and peer review.