Check your data before you wreck your model: The impact of careless responding on substance use data quality.
Abby L Braitman, Anna M Petrey, Jennifer L Shipley, Rachel Ayala Guzman, Emily Renzoni, Alison Looby, Adrian J Bravo
Alcohol (Hanover, York County, Pa.) · Published 2025-03-16 · DOI: 10.1111/acer.70024 (https://doi.org/10.1111/acer.70024)
Abstract
Background: The accuracy of survey responses is a concern for research data quality, especially in college student samples. However, examining the impact of removing participants who respond inaccurately or carelessly from analyses is warranted, given the potential loss of information or sample diversity. This study aimed to understand whether careless responding varies across a number of demographic indices, substance use behaviors, and the timing of survey completion.
Method: College students (N = 5809; 70.7% female; 75.7% White, non-Hispanic) enrolled in psychology classes from six universities completed an online survey assessing a variety of demographic and substance use-related information, which included four attention check questions dispersed throughout the hour-long survey. Differences in careless responding were assessed across multiple demographic groups, and we examined the impact of careless responding on data quality via a confirmatory factor analysis of a validated substance use measure, the Drinking Motives Questionnaire-Revised Short Form.
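As an illustration of how attention-check failures might be scored, the sketch below flags respondents by the number of checks they miss. The data file, column names (ac1-ac4), and expected answers are hypothetical placeholders, not taken from the study's materials.

```python
import pandas as pd

# Hypothetical survey export; file name, columns, and answer keys are illustrative only.
df = pd.read_csv("survey_responses.csv")

# Expected answers for four attention check items dispersed through the survey,
# e.g., "Please select 'Strongly disagree' for this item."
ATTENTION_CHECKS = {"ac1": 1, "ac2": 5, "ac3": 3, "ac4": 2}

# Count how many checks each respondent failed (missing responses count as failures).
df["n_checks_failed"] = sum(
    (df[item] != correct) | df[item].isna()
    for item, correct in ATTENTION_CHECKS.items()
)

# Classify respondents: 0 failures = attentive, 1 = borderline, 2+ = careless.
df["careless_group"] = pd.cut(
    df["n_checks_failed"], bins=[-1, 0, 1, 4], labels=["none", "one", "two_plus"]
)

print(df["careless_group"].value_counts())
```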
Results: Careless responding varied significantly by participant race, sex, gender, sexual orientation, and socioeconomic status. Substance use was generally unassociated with careless responding, though careless responding was associated with experiencing more alcohol-related problems. Careless responding was more prevalent when the survey was completed near the end of the semester. Finally, the factor structure of the drinking motives measure was affected by the inclusion of those who failed two or more attention check questions.
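Group differences of the kind reported here could be probed with a simple contingency-table test. The sketch below continues the hypothetical example above (reusing `df` and a placeholder demographic column "sex"); it is a minimal illustration, not the authors' analysis pipeline.

```python
from scipy.stats import chi2_contingency
import pandas as pd

# Assumes `df` from the previous sketch, with the derived "careless_group" flag
# and a hypothetical demographic column "sex".
table = pd.crosstab(df["sex"], df["careless_group"])

# Chi-square test of independence: does the rate of careless responding
# differ across groups?
chi2, p, dof, expected = chi2_contingency(table)
print(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```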
Conclusions: Including attention checks in surveys is an effective method to detect and address careless responding. However, omitting participants who evidence any careless responding from analyses may bias the sample demographics. We discuss recommendations for the use of attention check questions in undergraduate substance use cross-sectional surveys, including retaining participants who fail only one attention check, as this has a minimal impact on data quality while preserving sample diversity.
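The recommended retention rule (keep respondents who fail at most one check, exclude those who fail two or more) might be applied as in the sketch below, which also compares the demographic makeup of the sample before and after exclusion. It continues the hypothetical example above; the "race" column is a placeholder.

```python
# Retain respondents who failed at most one attention check, as recommended,
# and drop those who failed two or more.
analytic = df[df["n_checks_failed"] <= 1].copy()

# Compare demographic composition before and after exclusion to gauge whether
# removing careless responders shifts the sample (e.g., by race).
before = df["race"].value_counts(normalize=True)
after = analytic["race"].value_counts(normalize=True)
comparison = pd.concat([before, after], axis=1, keys=["full_sample", "analytic_sample"])
print(comparison.round(3))
```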