Differential Privacy: An Economic Method for Choosing Epsilon

Justin Hsu, Marco Gaboardi, Andreas Haeberlen, S. Khanna, Arjun Narayan, B. Pierce, Aaron Roth
{"title":"差分隐私:选择Epsilon的一种经济方法","authors":"Justin Hsu, Marco Gaboardi, Andreas Haeberlen, S. Khanna, Arjun Narayan, B. Pierce, Aaron Roth","doi":"10.1109/CSF.2014.35","DOIUrl":null,"url":null,"abstract":"Differential privacy is becoming a gold standard notion of privacy; it offers a guaranteed bound on loss of privacy due to release of query results, even under worst-case assumptions. The theory of differential privacy is an active research area, and there are now differentially private algorithms for a wide range of problems. However, the question of when differential privacy works in practice has received relatively little attention. In particular, there is still no rigorous method for choosing the key parameter ε, which controls the crucial tradeoff between the strength of the privacy guarantee and the accuracy of the published results. In this paper, we examine the role of these parameters in concrete applications, identifying the key considerations that must be addressed when choosing specific values. This choice requires balancing the interests of two parties with conflicting objectives: the data analyst, who wishes to learn something abou the data, and the prospective participant, who must decide whether to allow their data to be included in the analysis. We propose a simple model that expresses this balance as formulas over a handful of parameters, and we use our model to choose ε on a series of simple statistical studies. We also explore a surprising insight: in some circumstances, a differentially private study can be more accurate than a non-private study for the same cost, under our model. Finally, we discuss the simplifying assumptions in our model and outline a research agenda for possible refinements.","PeriodicalId":285965,"journal":{"name":"2014 IEEE 27th Computer Security Foundations Symposium","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2014-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"267","resultStr":"{\"title\":\"Differential Privacy: An Economic Method for Choosing Epsilon\",\"authors\":\"Justin Hsu, Marco Gaboardi, Andreas Haeberlen, S. Khanna, Arjun Narayan, B. Pierce, Aaron Roth\",\"doi\":\"10.1109/CSF.2014.35\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Differential privacy is becoming a gold standard notion of privacy; it offers a guaranteed bound on loss of privacy due to release of query results, even under worst-case assumptions. The theory of differential privacy is an active research area, and there are now differentially private algorithms for a wide range of problems. However, the question of when differential privacy works in practice has received relatively little attention. In particular, there is still no rigorous method for choosing the key parameter ε, which controls the crucial tradeoff between the strength of the privacy guarantee and the accuracy of the published results. In this paper, we examine the role of these parameters in concrete applications, identifying the key considerations that must be addressed when choosing specific values. This choice requires balancing the interests of two parties with conflicting objectives: the data analyst, who wishes to learn something abou the data, and the prospective participant, who must decide whether to allow their data to be included in the analysis. 
We propose a simple model that expresses this balance as formulas over a handful of parameters, and we use our model to choose ε on a series of simple statistical studies. We also explore a surprising insight: in some circumstances, a differentially private study can be more accurate than a non-private study for the same cost, under our model. Finally, we discuss the simplifying assumptions in our model and outline a research agenda for possible refinements.\",\"PeriodicalId\":285965,\"journal\":{\"name\":\"2014 IEEE 27th Computer Security Foundations Symposium\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-02-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"267\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2014 IEEE 27th Computer Security Foundations Symposium\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CSF.2014.35\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 IEEE 27th Computer Security Foundations Symposium","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CSF.2014.35","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 267

Abstract

Differential privacy is becoming a gold standard notion of privacy; it offers a guaranteed bound on loss of privacy due to release of query results, even under worst-case assumptions. The theory of differential privacy is an active research area, and there are now differentially private algorithms for a wide range of problems. However, the question of when differential privacy works in practice has received relatively little attention. In particular, there is still no rigorous method for choosing the key parameter ε, which controls the crucial tradeoff between the strength of the privacy guarantee and the accuracy of the published results. In this paper, we examine the role of these parameters in concrete applications, identifying the key considerations that must be addressed when choosing specific values. This choice requires balancing the interests of two parties with conflicting objectives: the data analyst, who wishes to learn something about the data, and the prospective participant, who must decide whether to allow their data to be included in the analysis. We propose a simple model that expresses this balance as formulas over a handful of parameters, and we use our model to choose ε on a series of simple statistical studies. We also explore a surprising insight: in some circumstances, a differentially private study can be more accurate than a non-private study for the same cost, under our model. Finally, we discuss the simplifying assumptions in our model and outline a research agenda for possible refinements.
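To make the ε tradeoff concrete: ε-differential privacy requires that for any two datasets D, D′ differing in one person's data and any output set S, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S]. Below is a minimal Python sketch of the standard Laplace mechanism, which is general differential-privacy background rather than the paper's own economic model; the function name, the counting-query example, and all numeric values are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Release a numeric query answer with epsilon-differential privacy.

    The Laplace mechanism adds noise drawn from Lap(sensitivity / epsilon):
    a smaller epsilon gives a stronger privacy guarantee but more noise,
    hence less accurate published results.
    """
    scale = sensitivity / epsilon
    return true_answer + np.random.laplace(loc=0.0, scale=scale)

# Illustrative example: a counting query has sensitivity 1, because adding
# or removing one participant's data changes the count by at most 1.
true_count = 472.0
for eps in (0.01, 0.1, 1.0):
    noisy = laplace_mechanism(true_count, sensitivity=1.0, epsilon=eps)
    # The expected absolute error of Lap(b) noise is b, i.e. 1/eps here.
    print(f"epsilon={eps:<5} noisy count={noisy:8.1f}  expected |error| = {1 / eps:g}")
```

In the paper's framing, this growing error as ε shrinks is what the analyst gives up, while the prospective participant's privacy loss (and hence the compensation they might demand) grows with ε; the paper's model expresses that balance as explicit formulas over a handful of parameters, which are not reproduced here.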