{"title":"使用人匹配分析和反应风格模型识别反应风格","authors":"Stefanie A. Wind, Yuan Ge","doi":"10.1080/15366367.2022.2104565","DOIUrl":null,"url":null,"abstract":"In selected-response assessments such as attitude surveys with Likert-type rating scales, examinees often select from rating scale categories to reflect their locations on a construct. Researchers have observed that some examinees exhibit response styles, which are systematic patterns of responses in which examinees are more likely to select certain response categories, regardless of their locations on the construct (Baumgartner & Steenkamp, 2001; Paulhus, 1991; Roberts, 2016; Van Vaerenbergh & Thomas, 2013). For example, a midpoint response style occurs when examinees select middle rating scale categories most often, and an extreme response style occurs when examinees tend to select extreme categories most often. Response styles complicate the interpretation of examinee and item location estimates because responses may not fully reflect examinee locations on the construct. Accordingly, response styles can present a source of construct-irrelevant variance that threatens the validity of the interpretation and use of scores (American Educational Research Association [AERA], American Psychological Association [APA], & National Council on Measurement in Education [NCME], 2014). To identify and minimize construct-irrelevant impacts of response styles, researchers have proposed tools such as the Partial Credit Model – Response Style (PCMRS; Tutz et al., 2018) as an extension of the Partial credit model (PCM; Masters, 1982) to model the tendency for examinees to exhibit response styles. The PCMRS directly models response styles as a person-specific gamma parameter and corrects estimates of item difficulty for the presence of response styles. Specifically, the response style is treated as a random effect, where small distances between thresholds indicate a tendency to exhibit an extreme response style and widened distances between thresholds indicate a tendency to exhibit a midpoint response style. Thus far, most research on the PCMRS has focused on the presentation of the model and statistical software tools for estimating it (Schauberger, 2020, 2020; Tutz et al., 2018; Tutz & Schauberger, 2020). However, we identified one application of this approach in which Dibek (2020) employed the PCM and the PCMRS to data from the 2015 administration of the TIMSS assessment and detected the presence of response styles among student participants. Given the lack of prior research focusing on the interpretation and use of the PCMRS in applied survey research contexts, additional explorations are warranted. We describe details about the PCMRS model parameters and interpretation more detail later in the manuscript. Researchers have also used person fit analysis (Glas & Khalid, 2016) from models based on measurement frameworks with clear guidelines for identifying meaningful response patterns. For example, researchers have used the PCM, which falls within the Rasch measurement theory framework (Rasch, 1960) to identify examinees whose patterns of responses are different from what would be expected given their estimated location on the latent variable. 
Compared to the PCMRS approach, the Rasch measurement approach, as reflected in person fit analysis procedures with the PCM, focuses more on evaluating the interpretability of person estimates based on observed and expected response","PeriodicalId":46596,"journal":{"name":"Measurement-Interdisciplinary Research and Perspectives","volume":null,"pages":null},"PeriodicalIF":0.6000,"publicationDate":"2023-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Identifying Response Styles Using Person Fit Analysis and Response-Styles Models\",\"authors\":\"Stefanie A. Wind, Yuan Ge\",\"doi\":\"10.1080/15366367.2022.2104565\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In selected-response assessments such as attitude surveys with Likert-type rating scales, examinees often select from rating scale categories to reflect their locations on a construct. Researchers have observed that some examinees exhibit response styles, which are systematic patterns of responses in which examinees are more likely to select certain response categories, regardless of their locations on the construct (Baumgartner & Steenkamp, 2001; Paulhus, 1991; Roberts, 2016; Van Vaerenbergh & Thomas, 2013). For example, a midpoint response style occurs when examinees select middle rating scale categories most often, and an extreme response style occurs when examinees tend to select extreme categories most often. Response styles complicate the interpretation of examinee and item location estimates because responses may not fully reflect examinee locations on the construct. Accordingly, response styles can present a source of construct-irrelevant variance that threatens the validity of the interpretation and use of scores (American Educational Research Association [AERA], American Psychological Association [APA], & National Council on Measurement in Education [NCME], 2014). To identify and minimize construct-irrelevant impacts of response styles, researchers have proposed tools such as the Partial Credit Model – Response Style (PCMRS; Tutz et al., 2018) as an extension of the Partial credit model (PCM; Masters, 1982) to model the tendency for examinees to exhibit response styles. The PCMRS directly models response styles as a person-specific gamma parameter and corrects estimates of item difficulty for the presence of response styles. Specifically, the response style is treated as a random effect, where small distances between thresholds indicate a tendency to exhibit an extreme response style and widened distances between thresholds indicate a tendency to exhibit a midpoint response style. Thus far, most research on the PCMRS has focused on the presentation of the model and statistical software tools for estimating it (Schauberger, 2020, 2020; Tutz et al., 2018; Tutz & Schauberger, 2020). However, we identified one application of this approach in which Dibek (2020) employed the PCM and the PCMRS to data from the 2015 administration of the TIMSS assessment and detected the presence of response styles among student participants. Given the lack of prior research focusing on the interpretation and use of the PCMRS in applied survey research contexts, additional explorations are warranted. We describe details about the PCMRS model parameters and interpretation more detail later in the manuscript. 
Researchers have also used person fit analysis (Glas & Khalid, 2016) from models based on measurement frameworks with clear guidelines for identifying meaningful response patterns. For example, researchers have used the PCM, which falls within the Rasch measurement theory framework (Rasch, 1960) to identify examinees whose patterns of responses are different from what would be expected given their estimated location on the latent variable. Compared to the PCMRS approach, the Rasch measurement approach, as reflected in person fit analysis procedures with the PCM, focuses more on evaluating the interpretability of person estimates based on observed and expected response\",\"PeriodicalId\":46596,\"journal\":{\"name\":\"Measurement-Interdisciplinary Research and Perspectives\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.6000,\"publicationDate\":\"2023-07-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Measurement-Interdisciplinary Research and Perspectives\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/15366367.2022.2104565\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"SOCIAL SCIENCES, INTERDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Measurement-Interdisciplinary Research and Perspectives","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/15366367.2022.2104565","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"SOCIAL SCIENCES, INTERDISCIPLINARY","Score":null,"Total":0}
Identifying Response Styles Using Person Fit Analysis and Response-Styles Models
In selected-response assessments such as attitude surveys with Likert-type rating scales, examinees select from rating scale categories to reflect their locations on a construct. Researchers have observed that some examinees exhibit response styles, which are systematic patterns of responses in which examinees are more likely to select certain response categories regardless of their locations on the construct (Baumgartner & Steenkamp, 2001; Paulhus, 1991; Roberts, 2016; Van Vaerenbergh & Thomas, 2013). For example, a midpoint response style occurs when examinees most often select middle rating scale categories, and an extreme response style occurs when examinees most often select extreme categories. Response styles complicate the interpretation of examinee and item location estimates because responses may not fully reflect examinee locations on the construct. Accordingly, response styles can be a source of construct-irrelevant variance that threatens the validity of the interpretation and use of scores (American Educational Research Association [AERA], American Psychological Association [APA], & National Council on Measurement in Education [NCME], 2014).

To identify and minimize construct-irrelevant impacts of response styles, researchers have proposed tools such as the Partial Credit Model – Response Style (PCMRS; Tutz et al., 2018), an extension of the partial credit model (PCM; Masters, 1982), to model examinees' tendency to exhibit response styles. The PCMRS directly models response styles with a person-specific gamma parameter and corrects item difficulty estimates for the presence of response styles. Specifically, the response style is treated as a random effect, where small distances between thresholds indicate a tendency toward an extreme response style and widened distances between thresholds indicate a tendency toward a midpoint response style. Thus far, most research on the PCMRS has focused on presenting the model and the statistical software tools for estimating it (Schauberger, 2020; Tutz et al., 2018; Tutz & Schauberger, 2020). However, we identified one application of this approach in which Dibek (2020) applied the PCM and the PCMRS to data from the 2015 administration of the TIMSS assessment and detected the presence of response styles among student participants. Given the lack of prior research focusing on the interpretation and use of the PCMRS in applied survey research contexts, additional explorations are warranted. We describe the PCMRS model parameters and their interpretation in more detail later in the manuscript.

Researchers have also used person fit analysis (Glas & Khalid, 2016) based on models from measurement frameworks with clear guidelines for identifying meaningful response patterns. For example, researchers have used the PCM, which falls within the Rasch measurement theory framework (Rasch, 1960), to identify examinees whose patterns of responses differ from what would be expected given their estimated locations on the latent variable. Compared to the PCMRS approach, the Rasch measurement approach, as reflected in person fit analysis procedures with the PCM, focuses more on evaluating the interpretability of person estimates based on observed and expected responses.
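To make the two ideas above concrete, the sketch below is an illustrative Python example, not the authors' implementation or the PCMRS estimation software cited above. It computes PCM category probabilities with a hypothetical person-specific "spread" parameter that stands in for the response-style idea (widened threshold spacing favoring middle categories, narrowed spacing favoring extreme categories), and it computes a basic unweighted mean-square person fit statistic of the kind used in Rasch-based person fit analysis. The function names, the spread parameterization, and the example data are assumptions made for illustration and do not reproduce the exact PCMRS parameterization in Tutz et al. (2018).

```python
# Illustrative sketch only: a simplified PCM with a person-specific "spread"
# parameter standing in for the PCMRS response-style idea, plus an unweighted
# (outfit-style) person fit statistic. Parameterization and data are hypothetical.

import numpy as np


def pcm_category_probs(theta, thresholds, gamma=0.0):
    """Category probabilities for one person-item pair under a PCM.

    theta      : person location on the latent variable
    thresholds : array of item thresholds delta_1, ..., delta_m
    gamma      : illustrative response-style parameter; gamma > 0 widens the
                 spacing between thresholds (midpoint-style tendency), while
                 gamma < 0 narrows it (extreme-style tendency). This is a
                 simplification, not the exact PCMRS parameterization.
    """
    thresholds = np.asarray(thresholds, dtype=float)
    m = len(thresholds)
    mid = (m + 1) / 2.0
    # Shift lower thresholds down and upper thresholds up (or the reverse).
    shifted = thresholds + gamma * (np.arange(1, m + 1) - mid)
    # Cumulative sums of (theta - threshold) define the PCM numerators.
    cum = np.concatenate(([0.0], np.cumsum(theta - shifted)))
    probs = np.exp(cum - cum.max())          # subtract max for numerical stability
    return probs / probs.sum()               # probabilities for categories 0..m


def outfit_person_fit(responses, thetas, item_thresholds):
    """Unweighted mean-square person fit statistics.

    responses       : (n_persons, n_items) matrix of category scores
    thetas          : person location estimates
    item_thresholds : list of threshold arrays, one per item
    Values far above 1 flag response patterns that are more haphazard than the
    model expects; values far below 1 flag overly deterministic patterns.
    """
    n_persons, n_items = responses.shape
    fit = np.zeros(n_persons)
    for p in range(n_persons):
        z2 = []
        for i in range(n_items):
            probs = pcm_category_probs(thetas[p], item_thresholds[i])
            cats = np.arange(len(probs))
            expected = np.sum(cats * probs)
            variance = np.sum((cats - expected) ** 2 * probs)
            z2.append((responses[p, i] - expected) ** 2 / variance)
        fit[p] = np.mean(z2)
    return fit


if __name__ == "__main__":
    # Hypothetical five-category (0-4) items with evenly spaced thresholds.
    thresholds = [np.array([-1.5, -0.5, 0.5, 1.5])] * 6
    rng = np.random.default_rng(0)
    thetas = rng.normal(size=4)
    # Hypothetical responses: the last person answers only in the extreme
    # categories, mimicking an extreme response style.
    responses = np.array([
        [2, 2, 3, 1, 2, 2],
        [1, 2, 2, 2, 3, 2],
        [3, 3, 2, 4, 3, 3],
        [0, 4, 0, 4, 4, 0],
    ])
    print(outfit_person_fit(responses, thetas, thresholds))
```

In this simplified setup, a person who repeatedly selects extreme categories despite a middling location estimate produces large squared residuals and an elevated fit value, which is the kind of pattern person fit analysis flags; the PCMRS, by contrast, absorbs such behavior into a person-specific parameter and adjusts the item estimates accordingly.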