Rationale and Study Checklist for Ethical Rejection of Participants on Crowdsourcing Research Platforms

Jon Agley, Casey Mumaw, Bethany Johnson

Ethics & Human Research, vol. 46, no. 4, pp. 38-46 (published 2024-06-30)
DOI: 10.1002/eahr.500217
URL: https://onlinelibrary.wiley.com/doi/10.1002/eahr.500217
Citations: 0
Abstract
Online participant recruitment (“crowdsourcing”) platforms are increasingly being used for research studies. While such platforms can rapidly provide access to large samples, there are concomitant concerns around data quality. Researchers have studied and demonstrated means to reduce the prevalence of low-quality data from crowdsourcing platforms, but approaches to doing so often involve rejecting work and/or denying payment to participants, which can pose ethical dilemmas. We write this essay as an associate professor and two institutional review board (IRB) directors to provide a perspective on the competing interests of participants/workers and researchers and to propose a checklist of steps that we believe may support workers' agency on the platform and lessen instances of unfair consequences to them while enabling researchers to definitively reject lower-quality work that might otherwise reduce the likelihood of their studies producing true results. We encourage further, explicit discussion of these issues among academics and among IRBs.