{"title":"局部不同私有机制的收缩","authors":"Shahab Asoodeh;Huanyu Zhang","doi":"10.1109/JSAIT.2024.3397305","DOIUrl":null,"url":null,"abstract":"We investigate the contraction properties of locally differentially private mechanisms. More specifically, we derive tight upper bounds on the divergence between \n<inline-formula> <tex-math>$P{\\mathsf K}$ </tex-math></inline-formula>\n and \n<inline-formula> <tex-math>$Q{\\mathsf K}$ </tex-math></inline-formula>\n output distributions of an \n<inline-formula> <tex-math>$\\varepsilon $ </tex-math></inline-formula>\n-LDP mechanism \n<inline-formula> <tex-math>$\\mathsf K$ </tex-math></inline-formula>\n in terms of a divergence between the corresponding input distributions P and Q, respectively. Our first main technical result presents a sharp upper bound on the \n<inline-formula> <tex-math>$\\chi ^{2}$ </tex-math></inline-formula>\n-divergence \n<inline-formula> <tex-math>$\\chi ^{2}(P{\\mathsf K}\\|Q{\\mathsf K})$ </tex-math></inline-formula>\n in terms of \n<inline-formula> <tex-math>$\\chi ^{2}(P\\|Q)$ </tex-math></inline-formula>\n and \n<inline-formula> <tex-math>$\\varepsilon $ </tex-math></inline-formula>\n. We also show that the same result holds for a large family of divergences, including KL-divergence and squared Hellinger distance. The second main technical result gives an upper bound on \n<inline-formula> <tex-math>$\\chi ^{2}(P{\\mathsf K}\\|Q{\\mathsf K})$ </tex-math></inline-formula>\n in terms of total variation distance \n<inline-formula> <tex-math>${\\textsf {TV}}(P, Q)$ </tex-math></inline-formula>\n and \n<inline-formula> <tex-math>$\\varepsilon $ </tex-math></inline-formula>\n. We then utilize these bounds to establish locally private versions of the van Trees inequality, Le Cam’s, Assouad’s, and the mutual information methods —powerful tools for bounding minimax estimation risks. These results are shown to lead to tighter privacy analyses than the state-of-the-arts in several statistical problems such as entropy and discrete distribution estimation, non-parametric density estimation, and hypothesis testing.","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"5 ","pages":"385-395"},"PeriodicalIF":0.0000,"publicationDate":"2024-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Contraction of Locally Differentially Private Mechanisms\",\"authors\":\"Shahab Asoodeh;Huanyu Zhang\",\"doi\":\"10.1109/JSAIT.2024.3397305\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We investigate the contraction properties of locally differentially private mechanisms. More specifically, we derive tight upper bounds on the divergence between \\n<inline-formula> <tex-math>$P{\\\\mathsf K}$ </tex-math></inline-formula>\\n and \\n<inline-formula> <tex-math>$Q{\\\\mathsf K}$ </tex-math></inline-formula>\\n output distributions of an \\n<inline-formula> <tex-math>$\\\\varepsilon $ </tex-math></inline-formula>\\n-LDP mechanism \\n<inline-formula> <tex-math>$\\\\mathsf K$ </tex-math></inline-formula>\\n in terms of a divergence between the corresponding input distributions P and Q, respectively. 
Our first main technical result presents a sharp upper bound on the \\n<inline-formula> <tex-math>$\\\\chi ^{2}$ </tex-math></inline-formula>\\n-divergence \\n<inline-formula> <tex-math>$\\\\chi ^{2}(P{\\\\mathsf K}\\\\|Q{\\\\mathsf K})$ </tex-math></inline-formula>\\n in terms of \\n<inline-formula> <tex-math>$\\\\chi ^{2}(P\\\\|Q)$ </tex-math></inline-formula>\\n and \\n<inline-formula> <tex-math>$\\\\varepsilon $ </tex-math></inline-formula>\\n. We also show that the same result holds for a large family of divergences, including KL-divergence and squared Hellinger distance. The second main technical result gives an upper bound on \\n<inline-formula> <tex-math>$\\\\chi ^{2}(P{\\\\mathsf K}\\\\|Q{\\\\mathsf K})$ </tex-math></inline-formula>\\n in terms of total variation distance \\n<inline-formula> <tex-math>${\\\\textsf {TV}}(P, Q)$ </tex-math></inline-formula>\\n and \\n<inline-formula> <tex-math>$\\\\varepsilon $ </tex-math></inline-formula>\\n. We then utilize these bounds to establish locally private versions of the van Trees inequality, Le Cam’s, Assouad’s, and the mutual information methods —powerful tools for bounding minimax estimation risks. These results are shown to lead to tighter privacy analyses than the state-of-the-arts in several statistical problems such as entropy and discrete distribution estimation, non-parametric density estimation, and hypothesis testing.\",\"PeriodicalId\":73295,\"journal\":{\"name\":\"IEEE journal on selected areas in information theory\",\"volume\":\"5 \",\"pages\":\"385-395\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-03-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE journal on selected areas in information theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10527360/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE journal on selected areas in information theory","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10527360/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Contraction of Locally Differentially Private Mechanisms
We investigate the contraction properties of locally differentially private mechanisms. More specifically, we derive tight upper bounds on the divergence between the output distributions $P\mathsf{K}$ and $Q\mathsf{K}$ of an $\varepsilon$-LDP mechanism $\mathsf{K}$ in terms of a divergence between the corresponding input distributions $P$ and $Q$. Our first main technical result presents a sharp upper bound on the $\chi^2$-divergence $\chi^2(P\mathsf{K}\|Q\mathsf{K})$ in terms of $\chi^2(P\|Q)$ and $\varepsilon$. We also show that the same result holds for a large family of divergences, including the KL-divergence and the squared Hellinger distance. The second main technical result gives an upper bound on $\chi^2(P\mathsf{K}\|Q\mathsf{K})$ in terms of the total variation distance $\mathsf{TV}(P,Q)$ and $\varepsilon$. We then utilize these bounds to establish locally private versions of the van Trees inequality and of Le Cam's, Assouad's, and the mutual information methods, which are powerful tools for bounding minimax estimation risks. These results are shown to lead to tighter privacy analyses than the state of the art in several statistical problems, such as entropy and discrete distribution estimation, non-parametric density estimation, and hypothesis testing.
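To make the contraction phenomenon described in the abstract concrete, here is a minimal numerical sketch, not taken from the paper, using binary randomized response (the canonical $\varepsilon$-LDP mechanism, which acts as a binary symmetric channel). It checks empirically that the output divergences shrink relative to the input divergences, comparing the observed ratios against the classical binary-symmetric-channel contraction factors $(e^\varepsilon-1)/(e^\varepsilon+1)$ for TV and its square for $\chi^2$. The paper's general bounds hold for arbitrary $\varepsilon$-LDP mechanisms and their exact constants may differ; all function names below are illustrative.

```python
import numpy as np

def chi2(p, q):
    """Chi-squared divergence: chi^2(p || q) = sum_x (p_x - q_x)^2 / q_x."""
    return np.sum((p - q) ** 2 / q)

def tv(p, q):
    """Total variation distance: TV(p, q) = (1/2) * sum_x |p_x - q_x|."""
    return 0.5 * np.sum(np.abs(p - q))

def randomized_response(eps):
    """Binary randomized response, the canonical eps-LDP mechanism:
    report the true bit with probability e^eps / (1 + e^eps).
    As a Markov kernel it is a binary symmetric channel with
    crossover probability 1 / (1 + e^eps)."""
    stay = np.exp(eps) / (1.0 + np.exp(eps))
    return np.array([[stay, 1.0 - stay],
                     [1.0 - stay, stay]])

eps = 1.0
K = randomized_response(eps)
# Classical contraction factors for this particular channel (not the
# paper's general bounds): (e^eps - 1)/(e^eps + 1) for TV, its square for chi^2.
tv_factor = (np.exp(eps) - 1.0) / (np.exp(eps) + 1.0)
chi2_factor = tv_factor ** 2

rng = np.random.default_rng(0)
worst_chi2, worst_tv = 0.0, 0.0
for _ in range(10_000):
    # Random pairs of input distributions on {0, 1}; p @ K is the output law.
    p, q = rng.dirichlet([1.0, 1.0]), rng.dirichlet([1.0, 1.0])
    worst_chi2 = max(worst_chi2, chi2(p @ K, q @ K) / chi2(p, q))
    worst_tv = max(worst_tv, tv(p @ K, q @ K) / tv(p, q))

print(f"worst chi^2 contraction ratio: {worst_chi2:.4f} <= {chi2_factor:.4f}")
print(f"worst TV contraction ratio:    {worst_tv:.4f} <= {tv_factor:.4f}")
```

In this binary case the TV ratio equals $(e^\varepsilon-1)/(e^\varepsilon+1)$ exactly for every pair of inputs, while the $\chi^2$ ratio approaches its factor only for near-uniform inputs, which is consistent with the abstract's distinction between sharp divergence-to-divergence bounds and bounds in terms of $\mathsf{TV}(P,Q)$.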