Title: How a Few Inconsistent Respondents Can Confound the Structure of Personality Survey Data
Authors: V. Arias, Fernando P. Ponce, A. Martínez-Molina
Journal: European Journal of Psychological Assessment (JCR Q2, Psychology, Applied; impact factor 3.2)
DOI: 10.1027/1015-5759/a000719 (https://doi.org/10.1027/1015-5759/a000719)
Publication date: 2022-07-21
Type: Journal Article
Citations: 0
Abstract
In survey data, inconsistent responses due to careless/insufficient effort (C/IE) can lead to problems of replicability and validity. However, data cleaning prior to the main analyses is not yet standard practice. We investigated the effect of C/IE responses on the structure of personality survey data. To this end, we analyzed the structure of the Core-Self Evaluations scale (CSE-S), building the detection of aberrant responses into the study design. While the original theoretical model of the CSE-S assumes that the construct is unidimensional (Judge et al., 2003), recent studies have argued for a multidimensional solution (positive CSE and negative CSE). We hypothesized that this multidimensionality is not substantive but results from the tendency of C/IE data to generate spurious dimensions. We estimated the confirmatory models before and after removing highly inconsistent response vectors in two independent samples (6% and 4.7% of cases removed, respectively). The analysis of the raw samples clearly favored retaining the two-dimensional model. In contrast, the analysis of the cleaned datasets suggested retaining a single factor. A C/IE rate of just 6% was enough to confound the results of the factor analysis. This result suggests that the structure of separate positive and negative CSE factors is spurious, arising from uncontrolled wording variance produced by a limited proportion of highly inconsistent response vectors. We encourage researchers to include screening for inconsistent responses in their research designs.
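The screening step the abstract describes can be illustrated with one widely used C/IE index, the person-total correlation: each respondent's item-response vector is correlated with the sample's mean item responses, and respondents with low or negative values are flagged as inconsistent. This is a minimal sketch for illustration only; the abstract does not state which detection method the authors used, and the function names and cutoff here are assumptions.

```python
import numpy as np

def person_total_correlations(responses):
    """For each respondent (row), correlate their item responses with the
    sample's mean response per item (person-total correlation).
    Low or negative values suggest careless/insufficient-effort responding.
    Assumes reverse-keyed items have already been recoded."""
    item_means = responses.mean(axis=0)
    ptc = np.empty(responses.shape[0])
    for i, row in enumerate(responses):
        ptc[i] = np.corrcoef(row, item_means)[0, 1]
    return ptc

def flag_inconsistent(responses, cutoff=0.0):
    """Flag respondents whose person-total correlation falls below `cutoff`.
    The cutoff of 0.0 is an illustrative choice, not the authors' criterion."""
    return person_total_correlations(responses) < cutoff

# Toy example: two consistent respondents and one reversed (careless-like) pattern.
responses = np.array([
    [1, 2, 3, 4, 5],   # consistent with the sample trend
    [2, 3, 4, 5, 5],   # consistent with the sample trend
    [5, 4, 3, 2, 1],   # mirror-image pattern, flagged as inconsistent
], dtype=float)

flags = flag_inconsistent(responses)
# flags -> array([False, False, True])
```

In the study's design, the factor models would then be re-estimated on the subset `responses[~flags]`, mirroring the before/after-cleaning comparison reported in the abstract.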
Journal overview:
The main purpose of the EJPA is to present important articles that provide seminal information on both theoretical and applied developments in this field. Articles reporting the construction of new measures or the advancement of an existing measure are given priority. The journal is directed to practitioners as well as academicians: its editors are convinced that the discipline of psychological assessment should be firmly attached to the roots of psychological science, while pursuing all the consequences of its applied, practice-oriented development.