{"title":"Evaluating Psychometric Differences Between Fast Versus Slow Responses on Rating Scale Items","authors":"N. Kim, D. Bolt","doi":"10.3102/10769986231195260","DOIUrl":null,"url":null,"abstract":"Some previous studies suggest that response times (RTs) on rating scale items can be informative about the content trait, but a more recent study suggests they may also be reflective of response styles. The latter result raises questions about the possible consideration of RTs for content trait estimation, as response styles are generally viewed as nuisance dimensions in the measurement of noncognitive constructs. In this article, we extend previous work exploring the simultaneous relevance of content and response style traits on RTs in self-report rating scale measurement by examining psychometric differences related to fast versus slow item responses. Following a parallel methodology applied with cognitive measures, we provide empirical illustrations of how RTs appear to be simultaneously reflective of both content and response style traits. Our results demonstrate that respondents may exhibit different response behaviors for fast versus slow responses and that both the content trait and response styles are relevant to such heterogeneity. These findings suggest that using RTs as a basis for improving the estimation of noncognitive constructs likely requires simultaneously attending to the effects of response styles.","PeriodicalId":48001,"journal":{"name":"Journal of Educational and Behavioral Statistics","volume":" ","pages":""},"PeriodicalIF":1.9000,"publicationDate":"2023-08-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Educational and Behavioral Statistics","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.3102/10769986231195260","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0
Abstract
Some previous studies suggest that response times (RTs) on rating scale items can be informative about the content trait, but a more recent study suggests they may also be reflective of response styles. The latter result raises questions about the possible consideration of RTs for content trait estimation, as response styles are generally viewed as nuisance dimensions in the measurement of noncognitive constructs. In this article, we extend previous work exploring the simultaneous relevance of content and response style traits to RTs in self-report rating scale measurement by examining psychometric differences related to fast versus slow item responses. Following a parallel methodology applied with cognitive measures, we provide empirical illustrations of how RTs appear to be simultaneously reflective of both content and response style traits. Our results demonstrate that respondents may exhibit different response behaviors for fast versus slow responses and that both the content trait and response styles are relevant to such heterogeneity. These findings suggest that using RTs as a basis for improving the estimation of noncognitive constructs likely requires simultaneously attending to the effects of response styles.
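To make the fast-versus-slow comparison concrete, the following is a minimal, hypothetical sketch (not the authors' model): it simulates rating-scale responses influenced by both a content trait and an extreme response style (ERS), assumes ERS-driven responses tend to be faster, splits each item's responses at the item-level median RT, and compares extreme-category usage across the fast and slow halves. All variable names, trait structure, and parameter values here are illustrative assumptions.

```python
# Hypothetical illustration only: median-split of RTs and a simple comparison of
# extreme-category usage for fast vs. slow responses. Not the authors' method.
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_items, n_cats = 2000, 10, 5  # assumed sizes for the sketch

theta = rng.normal(size=n_persons)  # content trait
ers = rng.normal(size=n_persons)    # extreme response style trait

# Ordinal responses on a 0..4 scale: theta shifts responses upward;
# ERS probabilistically pushes a response to the nearest end category.
base = np.clip(
    np.round(2 + theta[:, None] + rng.normal(scale=0.7, size=(n_persons, n_items))),
    0, n_cats - 1,
)
push_extreme = rng.random((n_persons, n_items)) < 1 / (1 + np.exp(-ers[:, None]))
responses = np.where(
    push_extreme,
    np.where(base >= (n_cats - 1) / 2, n_cats - 1, 0),
    base,
)

# Assumed RT process: higher ERS is associated with faster responding.
log_rt = 1.0 - 0.3 * ers[:, None] + rng.normal(scale=0.4, size=(n_persons, n_items))
rt = np.exp(log_rt)

# Split at each item's median RT and compare extreme-category rates.
fast = rt < np.median(rt, axis=0)
extreme = (responses == 0) | (responses == n_cats - 1)
print("P(extreme | fast) =", extreme[fast].mean())
print("P(extreme | slow) =", extreme[~fast].mean())
```

Under these assumptions, the fast half shows a higher extreme-category rate than the slow half, illustrating the kind of fast/slow psychometric difference tied to response styles that the article examines; the article itself does this with formal item response models rather than raw category proportions.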
Journal Introduction:
Journal of Educational and Behavioral Statistics, sponsored jointly by the American Educational Research Association and the American Statistical Association, publishes original articles that provide methods useful to those studying problems and issues in educational or behavioral research. Typical papers introduce new methods of analysis, establish their properties, and illustrate their use in education or behavioral research. Critical reviews of current practice, tutorial presentations of less well-known methods, and novel applications of already-known methods are also of interest. Papers that discuss statistical techniques without specific educational or behavioral interest, or that focus on substantive results without developing new statistical methods or models or making novel use of existing methods, have lower priority. Simulation studies, whether demonstrating properties of an existing method or comparing several existing methods (without providing a new method), also have low priority.