{"title":"一种利用局部差分隐私降低派生数据连通性的方法","authors":"Hidenobu Oguri","doi":"10.1109/ECAI46879.2019.9042011","DOIUrl":null,"url":null,"abstract":"A lot of personal data in the company are processed into various formats for each purpose of use, such as aggregate tables, and are generally stored as derived data. After the enforcement of the GDPR, when the user exercises “right to the erasure of personal data”, the companies are obliged to delete any link, or copy of the data taking all reasonable measures. On the other hand, since the data necessary for companies to comply with legal obligations should be retained, risk assessment of data to be deleted and data to be left is necessary. However, many derived data can be combined and the original data may be restored, and it is difficult to determine whether the data should be deleted. In this paper, we propose a method to measure the connectability of each attribute between derived data and manage the relationship by a graph structure. Then, by searching as a route the connectivity between the derived data, we measure the risk of connecting and restoring personal data. Using this structure, we propose a method to reduce connectability by using local differential privacy to disturb only the attribute with the highest connectability among the searched routes. And we also propose a measurement method of privacy protection index necessary to process to the level that cannot distinguish the users when two people were extracted from a database and applied differential privacy, and the effect was verified by experiments.","PeriodicalId":285780,"journal":{"name":"2019 11th International Conference on Electronics, Computers and Artificial Intelligence (ECAI)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A method of decreasing connectability of derived data, using local differential privacy\",\"authors\":\"Hidenobu Oguri\",\"doi\":\"10.1109/ECAI46879.2019.9042011\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A lot of personal data in the company are processed into various formats for each purpose of use, such as aggregate tables, and are generally stored as derived data. After the enforcement of the GDPR, when the user exercises “right to the erasure of personal data”, the companies are obliged to delete any link, or copy of the data taking all reasonable measures. On the other hand, since the data necessary for companies to comply with legal obligations should be retained, risk assessment of data to be deleted and data to be left is necessary. However, many derived data can be combined and the original data may be restored, and it is difficult to determine whether the data should be deleted. In this paper, we propose a method to measure the connectability of each attribute between derived data and manage the relationship by a graph structure. Then, by searching as a route the connectivity between the derived data, we measure the risk of connecting and restoring personal data. Using this structure, we propose a method to reduce connectability by using local differential privacy to disturb only the attribute with the highest connectability among the searched routes. 
And we also propose a measurement method of privacy protection index necessary to process to the level that cannot distinguish the users when two people were extracted from a database and applied differential privacy, and the effect was verified by experiments.\",\"PeriodicalId\":285780,\"journal\":{\"name\":\"2019 11th International Conference on Electronics, Computers and Artificial Intelligence (ECAI)\",\"volume\":\"14 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 11th International Conference on Electronics, Computers and Artificial Intelligence (ECAI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ECAI46879.2019.9042011\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 11th International Conference on Electronics, Computers and Artificial Intelligence (ECAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ECAI46879.2019.9042011","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A method of decreasing connectability of derived data, using local differential privacy
A large amount of personal data within a company is processed into various formats for each purpose of use, such as aggregate tables, and is generally stored as derived data. Since the enforcement of the GDPR, when a user exercises the "right to erasure of personal data", companies are obliged to take all reasonable measures to delete any link to, or copy of, that data. On the other hand, data that companies need in order to comply with legal obligations must be retained, so a risk assessment of the data to be deleted and the data to be kept is necessary. However, many pieces of derived data can be combined so that the original data may be restored, which makes it difficult to determine whether a given piece of data should be deleted. In this paper, we propose a method that measures the connectability of each attribute between derived data sets and manages these relationships with a graph structure. By searching the connections between derived data sets as routes, we measure the risk that the data can be linked and the personal data restored. Using this structure, we propose a method that reduces connectability by applying local differential privacy to disturb only the attribute with the highest connectability on the searched routes. We also propose a method of measuring the privacy-protection parameter required to perturb the data to a level at which two users extracted from a database can no longer be distinguished after differential privacy is applied, and we verify the effect through experiments.
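The abstract only summarizes the approach, so the sketch below is our own rough illustration rather than the paper's implementation. It models derived data sets as nodes of a graph whose edges carry a shared attribute and a connectability score, finds the highest-connectability attribute along a route between two data sets, and perturbs it with randomized response, a standard local differential privacy mechanism. All data set names, attribute domains, scores, and the epsilon value are hypothetical, and networkx is assumed to be available.

```python
import math
import random
import networkx as nx  # assumed available; any graph library would do

# Hypothetical connectability graph: nodes are derived data sets,
# edges carry the shared attribute and a connectability score in [0, 1].
g = nx.Graph()
g.add_edge("orders_agg", "customer_master", attribute="customer_id", connectability=0.9)
g.add_edge("customer_master", "support_log", attribute="email", connectability=0.7)

def riskiest_attribute_on_route(graph, source, target):
    """Return the attribute with the highest connectability on the route
    between two derived data sets (the route models how they could be joined)."""
    path = nx.shortest_path(graph, source, target)
    edges = list(zip(path, path[1:]))
    u, v = max(edges, key=lambda e: graph.edges[e]["connectability"])
    return graph.edges[u, v]["attribute"]

def randomized_response(value, domain, epsilon):
    """k-ary randomized response: keep the true value with probability
    e^eps / (e^eps + |domain| - 1), otherwise report a uniformly random
    other value. This satisfies epsilon-local differential privacy."""
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + len(domain) - 1)
    if random.random() < p_keep:
        return value
    return random.choice([v for v in domain if v != value])

# Disturb only the riskiest attribute on the route between two data sets.
attr = riskiest_attribute_on_route(g, "orders_agg", "support_log")
domain = ["a@example.com", "b@example.com", "c@example.com"]
noisy = randomized_response("a@example.com", domain, epsilon=1.0)
print(f"perturb attribute {attr!r}: a@example.com -> {noisy}")
```

In this reading, lowering epsilon (or perturbing the attribute that contributes most to a joinable route) weakens the link between the derived data sets while leaving the other attributes untouched, which matches the abstract's goal of reducing connectability rather than deleting or perturbing everything.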