{"title":"Exploring the Privacy Bound for Differential Privacy: From Theory to Practice","authors":"Xianmang He, Yuan Hong, Yindong Chen","doi":"10.4108/eai.8-4-2019.157414","DOIUrl":null,"url":null,"abstract":"Data privacy has attracted significant interests in both database theory and security communities in the past few decades. Differential privacy has emerged as a new paradigm for rigorous privacy protection regardless of adversaries prior knowledge. However, the meaning of privacy bound and how to select an appropriate may still be unclear to the general data owners. More recently, some approaches have been proposed to derive the upper bounds of for specified privacy risks. Unfortunately, these upper bounds suffer from some deficiencies (e.g., the bound relies on the data size, or might be too large), which greatly limits their applicability. To remedy this problem, we propose a novel approach that converts the privacy bound in differential privacy to privacy risks understandable to generic users, and present an in-depth theoretical analysis for it. Finally, we have conducted experiments to demonstrate the effectiveness of our model. Received on 19 December 2018; accepted on 21 January 2019; published on 25 January 2019","PeriodicalId":335727,"journal":{"name":"EAI Endorsed Trans. Security Safety","volume":"31 3","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-01-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"EAI Endorsed Trans. Security Safety","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4108/eai.8-4-2019.157414","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Data privacy has attracted significant interest in both the database theory and security communities over the past few decades. Differential privacy has emerged as a new paradigm for rigorous privacy protection regardless of the adversary's prior knowledge. However, the meaning of the privacy bound ε, and how to select an appropriate ε, may still be unclear to general data owners. More recently, several approaches have been proposed to derive upper bounds on ε for specified privacy risks. Unfortunately, these upper bounds suffer from deficiencies (e.g., the bound depends on the data size, or may be too large), which greatly limits their applicability. To remedy this problem, we propose a novel approach that converts the privacy bound ε in differential privacy into privacy risks understandable to generic users, and we present an in-depth theoretical analysis of it. Finally, we have conducted experiments to demonstrate the effectiveness of our model.

Received on 19 December 2018; accepted on 21 January 2019; published on 25 January 2019
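
For readers unfamiliar with the notation, the sketch below states the standard ε-differential-privacy guarantee and one commonly cited way of translating ε into an intuitive risk figure (a bound on an adversary's posterior belief under a uniform prior). It is an illustration of the general idea only, not necessarily the exact conversion developed in this paper.

```latex
% Standard \varepsilon-differential privacy: for all neighboring databases
% D, D' (differing in a single record) and every output set S of the
% randomized mechanism M,
\[
  \Pr[M(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(D') \in S].
\]
% One common illustrative conversion of \varepsilon into a privacy risk:
% if the adversary starts from a 50/50 prior on whether a target record is
% present, the posterior after observing the output is bounded by
\[
  \Pr[\text{record present} \mid \text{output}]
  \;\le\; \frac{e^{\varepsilon}}{1 + e^{\varepsilon}},
\]
% so, for example, \varepsilon = 0.1 caps the risk at about 0.525, while
% \varepsilon = 1 caps it at about 0.731.
```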