Max-Information, Differential Privacy, and Post-selection Hypothesis Testing
Ryan M. Rogers, Aaron Roth, Adam D. Smith, Om Thakkar
{"title":"最大信息、差异隐私和选择后假设检验","authors":"Ryan M. Rogers, Aaron Roth, Adam D. Smith, Om Thakkar","doi":"10.1109/FOCS.2016.59","DOIUrl":null,"url":null,"abstract":"In this paper, we initiate a principled study of how the generalization properties of approximate differential privacy can be used to perform adaptive hypothesis testing, while giving statistically valid p-value corrections. We do this by observing that the guarantees of algorithms with bounded approximate max-information are sufficient to correct the p-values of adaptively chosen hypotheses, and then by proving that algorithms that satisfy (∈,δ)-differential privacy have bounded approximate max information when their inputs are drawn from a product distribution. This substantially extends the known connection between differential privacy and max-information, which previously was only known to hold for (pure) (∈,0)-differential privacy. It also extends our understanding of max-information as a partially unifying measure controlling the generalization properties of adaptive data analyses. We also show a lower bound, proving that (despite the strong composition properties of max-information), when data is drawn from a product distribution, (∈,δ)-differentially private algorithms can come first in a composition with other algorithms satisfying max-information bounds, but not necessarily second if the composition is required to itself satisfy a nontrivial max-information bound. This, in particular, implies that the connection between (∈,δ)-differential privacy and max-information holds only for inputs drawn from product distributions, unlike the connection between (∈,0)-differential privacy and max-information.","PeriodicalId":414001,"journal":{"name":"2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)","volume":"56 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-04-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"80","resultStr":"{\"title\":\"Max-Information, Differential Privacy, and Post-selection Hypothesis Testing\",\"authors\":\"Ryan M. Rogers, Aaron Roth, Adam D. Smith, Om Thakkar\",\"doi\":\"10.1109/FOCS.2016.59\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we initiate a principled study of how the generalization properties of approximate differential privacy can be used to perform adaptive hypothesis testing, while giving statistically valid p-value corrections. We do this by observing that the guarantees of algorithms with bounded approximate max-information are sufficient to correct the p-values of adaptively chosen hypotheses, and then by proving that algorithms that satisfy (∈,δ)-differential privacy have bounded approximate max information when their inputs are drawn from a product distribution. This substantially extends the known connection between differential privacy and max-information, which previously was only known to hold for (pure) (∈,0)-differential privacy. It also extends our understanding of max-information as a partially unifying measure controlling the generalization properties of adaptive data analyses. 
We also show a lower bound, proving that (despite the strong composition properties of max-information), when data is drawn from a product distribution, (∈,δ)-differentially private algorithms can come first in a composition with other algorithms satisfying max-information bounds, but not necessarily second if the composition is required to itself satisfy a nontrivial max-information bound. This, in particular, implies that the connection between (∈,δ)-differential privacy and max-information holds only for inputs drawn from product distributions, unlike the connection between (∈,0)-differential privacy and max-information.\",\"PeriodicalId\":414001,\"journal\":{\"name\":\"2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)\",\"volume\":\"56 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-04-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"80\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/FOCS.2016.59\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FOCS.2016.59","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
In this paper, we initiate a principled study of how the generalization properties of approximate differential privacy can be used to perform adaptive hypothesis testing while giving statistically valid p-value corrections. We do this by observing that the guarantees of algorithms with bounded approximate max-information are sufficient to correct the p-values of adaptively chosen hypotheses, and then by proving that algorithms satisfying (ε, δ)-differential privacy have bounded approximate max-information when their inputs are drawn from a product distribution. This substantially extends the known connection between differential privacy and max-information, which was previously known to hold only for (pure) (ε, 0)-differential privacy. It also extends our understanding of max-information as a partially unifying measure controlling the generalization properties of adaptive data analyses. We also show a lower bound, proving that (despite the strong composition properties of max-information) when data is drawn from a product distribution, (ε, δ)-differentially private algorithms can come first in a composition with other algorithms satisfying max-information bounds, but not necessarily second if the composition itself is required to satisfy a nontrivial max-information bound. This, in particular, implies that the connection between (ε, δ)-differential privacy and max-information holds only for inputs drawn from product distributions, unlike the connection between (ε, 0)-differential privacy and max-information.
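To make the p-value guarantee concrete: following the standard definition of approximate max-information from prior work (due to Dwork et al.), on which this paper builds, an algorithm A has β-approximate max-information at most k bits, written I_∞^β(X; A(X)) ≤ k, if for every event O,

    \Pr[(X, A(X)) \in O] \;\le\; 2^{k} \cdot \Pr[X \otimes A(X) \in O] \;+\; \beta,

where X ⊗ A(X) denotes the product of the two marginals (an independent copy of the data paired with the algorithm's output). Taking O to be the event that the selected test's p-value falls below a threshold, a p-value that is valid at level α for a hypothesis fixed in advance remains valid at level 2^k·α + β after adaptive selection; equivalently, to guarantee post-selection level α one tests the raw p-value against (α − β)/2^k.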
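A minimal sketch of the resulting correction in code (Python). The function names and the illustrative parameter values below are ours, not from the paper; the max-information bounds k and β would come from whichever theorem applies to the selection procedure.

```python
import math

def corrected_p_value(p_raw, k_bits, beta):
    """Post-selection-valid p-value, given a raw p-value computed for the
    adaptively selected hypothesis and a bound I_infty^beta <= k_bits on the
    approximate max-information of the selection procedure."""
    return min(1.0, 2.0 ** k_bits * p_raw + beta)

def raw_threshold(alpha_post, k_bits, beta):
    """Level at which to test the raw p-value so that the post-selection
    test is valid at level alpha_post."""
    return max(0.0, (alpha_post - beta) / 2.0 ** k_bits)

def max_info_pure_dp(epsilon, n):
    """Bits of max-information for an (epsilon, 0)-DP selection procedure on
    n samples: prior work bounds this by roughly log2(e) * epsilon * n for
    every input distribution. (The paper's contribution is an analogous
    bound for (epsilon, delta)-DP under product distributions.)"""
    return math.log2(math.e) * epsilon * n

# Illustrative numbers only: a selection step with a 1e-6-approximate
# max-information bound of 2 bits inflates a raw p-value of 0.005 to ~0.02.
print(corrected_p_value(0.005, k_bits=2, beta=1e-6))  # -> 0.020001
print(raw_threshold(0.05, k_bits=2, beta=1e-6))       # -> ~0.0125
```

Note the exponential dependence on k: the correction is only useful when the selection procedure's max-information is a small number of bits, which is what makes the paper's tighter bounds for (ε, δ)-differential privacy on product distributions relevant in practice.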