CrowdSurfer: Seamlessly Integrating Crowd-Feedback Tasks into Everyday Internet Surfing

Saskia Haug, Ivo Benke, Daniel Fischer, A. Maedche
Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
Published: 2023-04-19 · DOI: 10.1145/3544548.3580994

Abstract: Crowd feedback overcomes scalability issues of feedback collection on interactive website designs. However, collecting feedback on crowdsourcing platforms decouples the feedback provider from the context of use, requiring more effort from crowdworkers to immerse themselves in that context during crowdsourcing tasks. In this paper, we present CrowdSurfer, a browser extension that seamlessly integrates design feedback collection into crowdworkers' everyday internet surfing. This enables the scalable collection of in situ feedback and, in parallel, allows crowdworkers to flexibly integrate their work into their daily activities. In a field study, we compare the CrowdSurfer against traditional feedback collection. Our qualitative and quantitative results reveal that, while in situ feedback with the CrowdSurfer is not necessarily better, crowdworkers appreciate the effortless, enjoyable, and innovative method of conducting feedback tasks. We contribute findings on in situ feedback collection and provide recommendations for integrating crowdworking tasks into everyday internet surfing.