Matthew Price, Johanna E. Hidalgo, Julia N. Kim, Alison C. Legrand, Zoe M.F. Brier, Katherine van Stolk-Cooke, Amy Hughes Lansing, Ateka A. Contractor
{"title":"电子人方法:从众包数据中识别欺诈性回复的方法","authors":"Matthew Price , Johanna E. Hidalgo , Julia N. Kim , Alison C. Legrand , Zoe M.F. Brier , Katherine van Stolk-Cooke , Amy Hughes Lansing , Ateka A. Contractor","doi":"10.1016/j.chb.2024.108253","DOIUrl":null,"url":null,"abstract":"<div><p>Crowdsourcing is an essential data collection method for psychological research. Concerns about the validity and quality of crowdsourced data persist, however. A recent documented increase in the number of invalid responses within crowdsourced data has highlighted the need for quality control measures. Although a number of approaches are recommended, few have been empirically evaluated. The present study evaluated a Cyborg Method that used automated evaluation of participant meta-data and a review of short answer responses. Two samples were recruited – in the first, the Cyborg Method was applied after data collection to gauge the extent to which invalid responses were collected when <em>a priori</em> quality controls were absent. In the second, the Cyborg Method was applied during data collection to determine if the method would proactively screen invalid responses. Results suggested that Cyborg Method identified a substantial portion of invalid responses and both automated and human evaluation components w necessary. Furthermore, the Cyborg Method could be applied proactively to screen invalid responses and substantially reduced the per participant cost of data collection. 
These results suggest that the Cyborg Method is a promising means by which to collect high quality crowdsourced data.</p></div>","PeriodicalId":48471,"journal":{"name":"Computers in Human Behavior","volume":null,"pages":null},"PeriodicalIF":9.0000,"publicationDate":"2024-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The cyborg method: A method to identify fraudulent responses from crowdsourced data\",\"authors\":\"Matthew Price , Johanna E. Hidalgo , Julia N. Kim , Alison C. Legrand , Zoe M.F. Brier , Katherine van Stolk-Cooke , Amy Hughes Lansing , Ateka A. Contractor\",\"doi\":\"10.1016/j.chb.2024.108253\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Crowdsourcing is an essential data collection method for psychological research. Concerns about the validity and quality of crowdsourced data persist, however. A recent documented increase in the number of invalid responses within crowdsourced data has highlighted the need for quality control measures. Although a number of approaches are recommended, few have been empirically evaluated. The present study evaluated a Cyborg Method that used automated evaluation of participant meta-data and a review of short answer responses. Two samples were recruited – in the first, the Cyborg Method was applied after data collection to gauge the extent to which invalid responses were collected when <em>a priori</em> quality controls were absent. In the second, the Cyborg Method was applied during data collection to determine if the method would proactively screen invalid responses. Results suggested that Cyborg Method identified a substantial portion of invalid responses and both automated and human evaluation components w necessary. Furthermore, the Cyborg Method could be applied proactively to screen invalid responses and substantially reduced the per participant cost of data collection. 
These results suggest that the Cyborg Method is a promising means by which to collect high quality crowdsourced data.</p></div>\",\"PeriodicalId\":48471,\"journal\":{\"name\":\"Computers in Human Behavior\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":9.0000,\"publicationDate\":\"2024-04-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers in Human Behavior\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0747563224001213\",\"RegionNum\":1,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, EXPERIMENTAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers in Human Behavior","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0747563224001213","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
The cyborg method: A method to identify fraudulent responses from crowdsourced data
Crowdsourcing is an essential data collection method for psychological research. Concerns about the validity and quality of crowdsourced data persist, however. A recently documented increase in the number of invalid responses within crowdsourced data has highlighted the need for quality control measures. Although a number of approaches are recommended, few have been empirically evaluated. The present study evaluated a Cyborg Method that combined automated evaluation of participant metadata with human review of short-answer responses. Two samples were recruited. In the first, the Cyborg Method was applied after data collection to gauge the extent to which invalid responses were collected when a priori quality controls were absent. In the second, the Cyborg Method was applied during data collection to determine whether it would proactively screen invalid responses. Results suggested that the Cyborg Method identified a substantial portion of invalid responses and that both the automated and human evaluation components were necessary. Furthermore, the Cyborg Method could be applied proactively to screen invalid responses and substantially reduced the per-participant cost of data collection. These results suggest that the Cyborg Method is a promising means of collecting high-quality crowdsourced data.
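The abstract does not specify which metadata checks the automated component uses, but the general idea of automated metadata screening can be sketched as follows. The particular flags here (duplicate IP addresses and implausibly fast completion times, with an arbitrary 60-second threshold) are illustrative assumptions, not the authors' published criteria:

```python
from collections import Counter

def flag_suspect_responses(responses, min_seconds=60):
    """Flag crowdsourced responses whose metadata suggests fraud.

    Each response is a dict with 'id', 'ip', and 'duration_s' keys.
    Returns a dict mapping response id -> list of flag reasons.
    The checks and threshold are illustrative only, not the
    published Cyborg Method criteria.
    """
    # Count how often each IP address appears across all responses.
    ip_counts = Counter(r["ip"] for r in responses)
    flags = {}
    for r in responses:
        reasons = []
        if ip_counts[r["ip"]] > 1:        # same IP submitted more than once
            reasons.append("duplicate_ip")
        if r["duration_s"] < min_seconds:  # finished implausibly fast
            reasons.append("too_fast")
        if reasons:
            flags[r["id"]] = reasons
    return flags
```

In a workflow like the one the abstract describes, the automatically flagged responses would then be passed to a human reviewer, who examines the short-answer text before a final keep-or-reject decision, which is the pairing of machine and human judgment that the "cyborg" name evokes.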
Journal Introduction:
Computers in Human Behavior is a scholarly journal that explores the psychological aspects of computer use. It covers original theoretical works, research reports, literature reviews, and software and book reviews. The journal examines both the use of computers in psychology, psychiatry, and related fields, and the psychological impact of computer use on individuals, groups, and society. Articles discuss topics such as professional practice, training, research, human development, learning, cognition, personality, and social interactions. It focuses on human interactions with computers, considering the computer as a medium through which human behaviors are shaped and expressed. Professionals interested in the psychological aspects of computer use will find this journal valuable, even with limited knowledge of computers.