Automating accountability? Privacy policies, data transparency, and the third party problem

University of Toronto Law Journal · IF 0.7 · JCR Q2 (Law) · CAS Region 4 (Sociology) · Pub Date: 2021-12-09 · DOI: 10.3138/utlj-2020-0136
D. Lie, Lisa M. Austin, Peter Yi Ping Sun, Wen Qiu
Abstract: We have a data transparency problem. Currently, one of the main mechanisms we have to understand data flows is through the self-reporting that organizations provide through privacy policies. These suffer from many well-known problems, problems that are becoming more acute with the increasing complexity of the data ecosystem and the role of third parties – the affiliates, partners, processors, ad agencies, analytic services, and data brokers involved in the contemporary data practices of organizations. In this article, we argue that automating privacy policy analysis can improve the usability of privacy policies as a transparency mechanism. Our argument has five parts. First, we claim that we need to shift from thinking about privacy policies as a transparency mechanism that enhances consumer choice and see them as a transparency mechanism that enhances meaningful accountability. Second, we discuss a research tool that we prototyped, called AppTrans (for Application Transparency), which can detect inconsistencies between the declarations in a privacy policy and the actions the mobile application can potentially take if it is used. We used AppTrans to test seven hundred applications and found that 59.5 per cent were collecting data in ways that were not declared in their policies. The vast majority of the discrepancies were due to third party data collection such as advertising and analytics. Third, we outline the follow-on research we did to extend AppTrans to analyse the information sharing of mobile applications with third parties, with mixed results. Fourth, we situate our findings in relation to the third party issues that came to light in the recent Cambridge Analytica scandal and the calls from regulators for enhanced technical safeguards in managing these third party relationships. Fifth, we discuss some of the limitations of privacy policy automation as a strategy for enhanced data transparency and the policy implications of these limitations.

University of Toronto Law Journal, Vol. 72, No. 1, pp. 155–188.
Citations: 1
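The abstract describes AppTrans as detecting inconsistencies between what a privacy policy declares and what an app can potentially collect, with most discrepancies traced to third-party advertising and analytics SDKs. The sketch below is an illustration of that general comparison only; AppTrans's actual implementation is not described here, so all names, data categories, and the set-difference logic are assumptions.

```python
# Illustrative sketch, not AppTrans itself: compare data types declared in a
# privacy policy against data types the app could collect (including via
# third-party SDKs) and flag anything undeclared. All inputs are hypothetical.

# Hypothetical data categories extracted from the app's privacy policy.
declared = {"email", "device_id"}

# Hypothetical data categories inferable from the app's permissions and
# embedded SDK calls, tagged with their collection source.
potential = {
    "email": "first_party",
    "device_id": "first_party",
    "location": "third_party_ads",           # e.g. an advertising SDK
    "usage_stats": "third_party_analytics",  # e.g. an analytics SDK
}

def undeclared_collection(declared, potential):
    """Return data categories the app can collect but the policy never declares."""
    return {dtype: source for dtype, source in potential.items()
            if dtype not in declared}

flagged = undeclared_collection(declared, potential)
print(flagged)  # {'location': 'third_party_ads', 'usage_stats': 'third_party_analytics'}
```

Note that, consistent with the article's finding, the undeclared items in this toy example all come from third-party sources: the first-party collection is declared, while the SDK-driven collection is not.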

Source journal: CiteScore 1.70 · Self-citation rate 16.70% · Articles per year: 26