{"title":"Do Extra Dollars Pay Off? - An Exploratory Study on TopCoder","authors":"Lili Wang, Yong Wang","doi":"10.1145/3195863.3196958","DOIUrl":null,"url":null,"abstract":"In general crowdsourcing, different task requesters employ different pricing strategies to balance task cost and expected worker performance. While most existing studies show that increasing incentives tend to benefit crowdsourcing outcomes, i.e. broader participation and higher worker performance, some reported inconsistent observations. In addition, there is the lack of investigation in the domain of software crowdsourcing. To that end, this study examines the extent to which task pricing strategies are employed in software crowdsourcing. More specifically, it aims at investigating the impact of pricing strategies on worker’s behaviors and performance. It reports a conceptual model between pricing strategies and potential influences on worker behaviors, an algorithm for measuring the effect of pricing strategies, and an empirical evaluation on 434 crowdsourcing tasks extracted from TopCoder. The results show that: 1) Strategic task pricing patterns, i.e. under-pricing and over-pricing are prevalent in software crowdsourcing practices; 2) Overpriced tasks are more likely to attract more workers to register and submit, and have higher task completion velocity; 3) Underpriced tasks tend to associate with less registrants and submissions, and lower task completion velocity. These observations imply that task requesters can typically get their extra dollars investment paid-off if employing proactive task pricing strategy. However, it is also observed that it appears to be a counter-intuitive effect on the score of final deliverable. We believe the preliminary findings are helpful for task requesters in better pricing decision and hope to stimulate further discussions and research in pricing strategies of software crowdsourcing.","PeriodicalId":131063,"journal":{"name":"2018 IEEE/ACM 5th International Workshop on Crowd Sourcing in Software Engineering (CSI-SE)","volume":"630 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-05-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE/ACM 5th International Workshop on Crowd Sourcing in Software Engineering (CSI-SE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3195863.3196958","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
In general crowdsourcing, different task requesters employ different pricing strategies to balance task cost and expected worker performance. While most existing studies show that increasing incentives tends to benefit crowdsourcing outcomes, i.e., broader participation and higher worker performance, some have reported inconsistent observations. In addition, there is a lack of investigation in the domain of software crowdsourcing. To that end, this study examines the extent to which task pricing strategies are employed in software crowdsourcing. More specifically, it investigates the impact of pricing strategies on workers' behaviors and performance. It reports a conceptual model linking pricing strategies to potential influences on worker behaviors, an algorithm for measuring the effect of pricing strategies, and an empirical evaluation of 434 crowdsourcing tasks extracted from TopCoder. The results show that: 1) strategic task pricing patterns, i.e., under-pricing and over-pricing, are prevalent in software crowdsourcing practice; 2) over-priced tasks are more likely to attract workers to register and submit, and have higher task completion velocity; 3) under-priced tasks tend to be associated with fewer registrants and submissions, and lower task completion velocity. These observations imply that task requesters can typically get their extra dollars paid off by employing a proactive task pricing strategy. However, a counter-intuitive effect is also observed on the score of the final deliverable. We believe these preliminary findings can help task requesters make better pricing decisions, and we hope they stimulate further discussion and research on pricing strategies in software crowdsourcing.
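The abstract does not detail the paper's detection algorithm, so the following is only a minimal sketch of how under-/over-pricing might be operationalized: compare each task's actual prize against a baseline "predicted prize" (assumed here to come from comparable historical tasks), then summarize worker behavior per pricing group. All field names, the 15% tolerance threshold, and the submissions-per-day velocity metric are hypothetical illustrations, not the authors' definitions.

```python
from dataclasses import dataclass
from statistics import median

@dataclass
class Task:
    task_id: str
    prize: float            # actual prize offered (USD)
    predicted_prize: float  # hypothetical baseline from comparable tasks
    registrants: int        # workers who registered for the task
    submissions: int        # workers who submitted a solution
    duration_days: float    # time from posting to completion

def pricing_pattern(task: Task, tolerance: float = 0.15) -> str:
    """Label a task by comparing its actual prize to the baseline.

    The 15% tolerance band is an arbitrary illustrative choice.
    """
    ratio = task.prize / task.predicted_prize
    if ratio > 1 + tolerance:
        return "over-priced"
    if ratio < 1 - tolerance:
        return "under-priced"
    return "normally-priced"

def summarize(tasks):
    """Group tasks by pricing pattern and report median registrants,
    submissions, and completion velocity (here: submissions per day)."""
    groups = {}
    for t in tasks:
        groups.setdefault(pricing_pattern(t), []).append(t)
    for label, ts in groups.items():
        velocity = [t.submissions / t.duration_days for t in ts]
        print(label,
              "| n =", len(ts),
              "| median registrants:", median(t.registrants for t in ts),
              "| median submissions:", median(t.submissions for t in ts),
              "| median velocity:", round(median(velocity), 2))

if __name__ == "__main__":
    # Toy data only; the study analyzed 434 real TopCoder tasks.
    demo = [
        Task("T1", 1200, 900, 25, 8, 10),  # over-priced
        Task("T2", 600, 900, 9, 2, 14),    # under-priced
        Task("T3", 950, 900, 14, 5, 12),   # normally priced
    ]
    summarize(demo)
```

On toy data like the above, an over-priced task would show higher median registrants, submissions, and velocity than an under-priced one, mirroring findings 2) and 3); the paper's reported counter-intuitive effect on deliverable scores would not be visible in this simplified behavioral summary.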