Consumer-photo quality assessment: Challenges and pitfalls in crowdsourcing
Michele A. Saad, Patrick McKnight, Jake Quartuccio, David G. Nicholas, Ramesh Jaladi, P. Corriveau
2016 Eighth International Conference on Quality of Multimedia Experience (QoMEX), pp. 1-6, June 6, 2016. DOI: 10.1109/QoMEX.2016.7498970
We discuss caveats and challenges in crowdsourced subjective image-quality evaluation studies that use real consumer photos with non-simulated distortions. The subtle nature of consumer image artifacts, together with the nuanced quality differences among photos from various consumer devices, necessitates that crowdsourcing studies be designed with extra caution. In this work, we point out caveats to look for in the literature, and we draw attention to and discuss the consequences of various design choices for the subjective responses received. These design choices include: 1) stimulus viewing mode, 2) stimulus habituation, 3) study length, and 4) the placement of attention items within the study. We show how these design parameters are key to maximizing correspondence with lab-based responses, and we emphasize how this differs from tests that use simulated image distortions.
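To make design choice 4 concrete, below is a minimal, hypothetical Python sketch of how attention-check items might be placed at chosen positions within a participant's stimulus sequence. The function name, photo identifiers, and positions are illustrative assumptions for exposition, not the study's actual implementation.

```python
import random

def build_study_sequence(stimuli, attention_items, positions):
    """Interleave attention-check items into a shuffled stimulus sequence.

    stimuli: list of photo IDs to be rated.
    attention_items: catch trials with known correct answers, used to
        screen inattentive crowd workers.
    positions: final-sequence indices for the attention items; because
        insertions are done in ascending order, each item lands exactly
        at its requested index in the returned sequence.
    """
    sequence = list(stimuli)
    random.shuffle(sequence)  # randomize presentation order per participant
    for item, pos in sorted(zip(attention_items, positions), key=lambda p: p[1]):
        sequence.insert(pos, item)
    return sequence

# Example: a short session with one early and one late attention check
# (hypothetical IDs and positions, chosen only for illustration).
photos = [f"photo_{i:03d}" for i in range(20)]
checks = ["attention_check_A", "attention_check_B"]
session = build_study_sequence(photos, checks, positions=[5, 18])
print(session)
```

Fixing attention-item positions (e.g., one early, one late) rather than scattering them at random is one way a study designer could probe whether worker vigilance changes with study length, which is the kind of interaction among design parameters the paper examines.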