Microworkers vs. Facebook: The impact of crowdsourcing platform choice on experimental results
Authors: B. Gardlo, M. Ries, T. Hossfeld, R. Schatz
DOI: 10.1109/QoMEX.2012.6263885
Published in: 2012 Fourth International Workshop on Quality of Multimedia Experience, pp. 35-36
Publication date: 2012-07-05
Citations: 20
Abstract
Subjective laboratory tests are a proven, reliable approach to multimedia quality assessment. Nonetheless, in certain cases novel quality of experience (QoE) assessment methods can yield better results or enable more cost-effective test execution. In this respect, crowdsourcing can be considered an emerging method that enables researchers to better explore end-user quality perception when a large panel of subjects is required, particularly for Web application usage scenarios. However, the crowdsourcing platform chosen for recruiting participants can affect the experimental results. In this paper, we examine the platform's influence on QoE results by comparing the MOS scores of two otherwise identical subjective HD video quality experiments, one executed on a paid and one on a non-paid crowdsourcing platform.
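The comparison hinges on the Mean Opinion Score (MOS), i.e. the arithmetic mean of subjects' category ratings for a test condition, typically reported with a confidence interval. The following sketch illustrates the standard MOS computation on hypothetical ratings (the rating values and platform labels are invented for illustration, not taken from the paper):

```python
from math import sqrt
from statistics import mean, stdev

def mos(ratings):
    """Mean Opinion Score: arithmetic mean of ACR ratings (1=bad .. 5=excellent)."""
    return mean(ratings)

def ci95(ratings):
    """Half-width of the 95% confidence interval (normal approximation)."""
    return 1.96 * stdev(ratings) / sqrt(len(ratings))

# Hypothetical ratings for the same HD video clip collected on two platforms
paid_platform = [4, 3, 4, 5, 3, 4, 4, 3, 5, 4]
nonpaid_platform = [3, 3, 4, 2, 3, 4, 3, 3, 2, 4]

print(f"paid:     MOS = {mos(paid_platform):.2f} ± {ci95(paid_platform):.2f}")
print(f"non-paid: MOS = {mos(nonpaid_platform):.2f} ± {ci95(nonpaid_platform):.2f}")
```

A per-condition gap between the two MOS values, beyond the overlap of the confidence intervals, is the kind of platform effect the experiment is designed to detect.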