Privacy Nudging in Search: Investigating Potential Impacts
Steven Zimmerman, Alistair Thorpe, C. Fox, Udo Kruschwitz
{"title":"搜索中的隐私推动:调查潜在影响","authors":"Steven Zimmerman, Alistair Thorpe, C. Fox, Udo Kruschwitz","doi":"10.1145/3295750.3298952","DOIUrl":null,"url":null,"abstract":"From their impacts to potential threats, privacy and misinformation are a recurring top news story. Social media platforms (e.g. Facebook) and information retrieval (IR) systems (e.g. Google), are now in the public spotlight to address these issues. Our research investigates an approach, known as Nudging, applied to the domain of IR, as a potential means to minimize impacts and threats surrounding both matters. We perform our study in the space of health search for two reasons. First, encounters with misinformation in this space have potentially grave outcomes. Second, there are many potential threats to personal privacy as a result of the data collected during a search task. Adopting methods and a corpus from previous work as the foundation, our study asked users to determine the effectiveness of a treatment for 10 medical conditions. Users performed the tasks on 4 variants of a search engine results page (SERP) and a control, with 3 of the SERP's being a Nudge (re-ranking, filtering and a visual cue) intended to reduce impacts to privacy with minimal impact to search result quality. The aim of our work is to determine the Nudge that is least impactful to good decision making while simultaneously increasing privacy protection. We find privacy impacts are significantly reduced for the re-ranking and filtering strategies, with no significant impacts on quality of decision making.","PeriodicalId":187771,"journal":{"name":"Proceedings of the 2019 Conference on Human Information Interaction and Retrieval","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":"{\"title\":\"Privacy Nudging in Search: Investigating Potential Impacts\",\"authors\":\"Steven Zimmerman, Alistair Thorpe, C. Fox, Udo Kruschwitz\",\"doi\":\"10.1145/3295750.3298952\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"From their impacts to potential threats, privacy and misinformation are a recurring top news story. Social media platforms (e.g. Facebook) and information retrieval (IR) systems (e.g. Google), are now in the public spotlight to address these issues. Our research investigates an approach, known as Nudging, applied to the domain of IR, as a potential means to minimize impacts and threats surrounding both matters. We perform our study in the space of health search for two reasons. First, encounters with misinformation in this space have potentially grave outcomes. Second, there are many potential threats to personal privacy as a result of the data collected during a search task. Adopting methods and a corpus from previous work as the foundation, our study asked users to determine the effectiveness of a treatment for 10 medical conditions. Users performed the tasks on 4 variants of a search engine results page (SERP) and a control, with 3 of the SERP's being a Nudge (re-ranking, filtering and a visual cue) intended to reduce impacts to privacy with minimal impact to search result quality. The aim of our work is to determine the Nudge that is least impactful to good decision making while simultaneously increasing privacy protection. 
We find privacy impacts are significantly reduced for the re-ranking and filtering strategies, with no significant impacts on quality of decision making.\",\"PeriodicalId\":187771,\"journal\":{\"name\":\"Proceedings of the 2019 Conference on Human Information Interaction and Retrieval\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-03-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"16\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2019 Conference on Human Information Interaction and Retrieval\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3295750.3298952\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2019 Conference on Human Information Interaction and Retrieval","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3295750.3298952","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
From their impacts to potential threats, privacy and misinformation are recurring top news stories. Social media platforms (e.g. Facebook) and information retrieval (IR) systems (e.g. Google) are now in the public spotlight to address these issues. Our research investigates an approach known as Nudging, applied to the domain of IR, as a potential means to minimize the impacts and threats surrounding both matters. We perform our study in the space of health search for two reasons. First, encounters with misinformation in this space can have grave outcomes. Second, the data collected during a search task poses many potential threats to personal privacy. Building on methods and a corpus from previous work, our study asked users to determine the effectiveness of a treatment for 10 medical conditions. Users performed the tasks on 4 variants of a search engine results page (SERP) and a control, with 3 of the SERPs implementing a Nudge (re-ranking, filtering, or a visual cue) intended to reduce privacy impacts with minimal impact on search result quality. The aim of our work is to determine the Nudge that least impairs good decision making while simultaneously increasing privacy protection. We find that privacy impacts are significantly reduced for the re-ranking and filtering strategies, with no significant impact on the quality of decision making.
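The abstract does not describe how the three Nudges were implemented, so the sketch below is purely illustrative and is not the authors' system. It assumes each search result carries a hypothetical privacy_risk score in [0, 1] (for instance, an estimate derived from third-party trackers on the landing page) alongside its relevance score, and shows one plausible way a re-ranking, filtering, and visual-cue Nudge could be applied to a SERP. All names and thresholds here are assumptions made for the example.

    # Illustrative sketch only; not the implementation from the paper.
    from dataclasses import dataclass

    @dataclass
    class Result:
        url: str
        relevance: float     # original ranking score
        privacy_risk: float  # hypothetical risk estimate: 0 = safe, 1 = risky

    def rerank_nudge(results, risk_weight=0.5):
        """Re-ranking Nudge: demote risky results without removing them."""
        return sorted(results,
                      key=lambda r: r.relevance - risk_weight * r.privacy_risk,
                      reverse=True)

    def filter_nudge(results, max_risk=0.7):
        """Filtering Nudge: drop results above a risk threshold."""
        return [r for r in results if r.privacy_risk <= max_risk]

    def visual_cue_nudge(results, max_risk=0.7):
        """Visual-cue Nudge: keep the ranking, but annotate risky results."""
        return [(r, "[tracking detected]" if r.privacy_risk > max_risk else "")
                for r in results]

    if __name__ == "__main__":
        serp = [
            Result("https://example-clinic.org/treatment", 0.92, 0.10),
            Result("https://ad-heavy-health-blog.example/cure", 0.95, 0.85),
            Result("https://health-portal.example/info", 0.88, 0.05),
        ]
        for r in rerank_nudge(serp):
            print(f"{r.url}  relevance={r.relevance}  risk={r.privacy_risk}")

The design intuition matches the abstract's framing: re-ranking and filtering change which results users are likely to click (and thus how much personal data leaks to risky pages), while the visual cue leaves the result list untouched and only informs the user.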