{"title":"Decentralized Federated Recommendation with Privacy-Aware Structured Client-Level Graph","authors":"Zhitao Li, Zhaohao Lin, Feng Liang, Weike Pan, Qiang Yang, Zhong Ming","doi":"10.1145/3641287","DOIUrl":null,"url":null,"abstract":"<p>Recommendation models are deployed in a variety of commercial applications in order to provide personalized services for users. </p><p>However, most of them rely on the users’ original rating records that are often collected by a centralized server for model training, which may cause privacy issues. </p><p>Recently, some centralized federated recommendation models are proposed for the protection of users’ privacy, which however requires a server for coordination in the whole process of model training. </p><p>As a response, we propose a novel privacy-aware decentralized federated recommendation (DFedRec) model, which is lossless compared with the traditional model in recommendation performance and is thus more accurate than other models in this line. </p><p>Specifically, we design a privacy-aware structured client-level graph for the sharing of the model parameters in the process of model training, which is a one-stone-two-bird strategy, i.e., it protects users’ privacy via some randomly sampled fake entries and reduces the communication cost by sharing the model parameters only with the related neighboring users. </p><p>With the help of the privacy-aware structured client-level graph, we propose two novel collaborative training mechanisms in the setting without a server, including a batch algorithm DFedRec(b) and a stochastic one DFedRec(s), where the former requires the anonymity mechanism while the latter does not. They are both equivalent to PMF trained in a centralized server and are thus lossless. </p><p>We then provide formal analysis of privacy guarantee of our methods and conduct extensive empirical studies on three public datasets with explicit feedback, which show the effectiveness of our DFedRec, i.e., it is privacy aware, communication efficient, and lossless.</p>","PeriodicalId":48967,"journal":{"name":"ACM Transactions on Intelligent Systems and Technology","volume":"41 1","pages":""},"PeriodicalIF":7.2000,"publicationDate":"2024-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Intelligent Systems and Technology","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1145/3641287","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Recommendation models are deployed in a variety of commercial applications to provide personalized services for users.
However, most of them rely on users’ original rating records, which are often collected by a centralized server for model training and may therefore raise privacy concerns.
Recently, some centralized federated recommendation models have been proposed to protect users’ privacy; however, they still require a server to coordinate the whole process of model training.
In response, we propose a novel privacy-aware decentralized federated recommendation (DFedRec) model, which is lossless in recommendation performance compared with the traditional centralized model and is thus more accurate than other models in this line of work.
Specifically, we design a privacy-aware structured client-level graph for sharing the model parameters during model training. This is a one-stone-two-birds strategy: it protects users’ privacy via some randomly sampled fake entries, and it reduces the communication cost by sharing the model parameters only with the related neighboring users.
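To make the idea of the client-level graph concrete, below is a minimal Python sketch of one plausible construction, assuming each client pads its true item set with randomly sampled fake item IDs and is connected to clients whose padded item sets overlap. The function names (padded_item_set, build_client_graph) and all parameters are illustrative and are not taken from the paper.

```python
# Illustrative sketch (not the authors' exact construction): each client pads its
# true item set with randomly sampled fake item IDs, and two clients become
# neighbors in the client-level graph only if their padded item sets overlap,
# so model parameters are later exchanged only along these edges.
import random

def padded_item_set(true_items, num_items, num_fake, rng=random):
    """Return the client's true items plus randomly sampled fake entries."""
    fake_candidates = [i for i in range(num_items) if i not in true_items]
    fakes = set(rng.sample(fake_candidates, min(num_fake, len(fake_candidates))))
    return set(true_items) | fakes

def build_client_graph(padded_sets):
    """Connect clients whose padded item sets share at least one item."""
    clients = list(padded_sets)
    edges = {u: set() for u in clients}
    for i, u in enumerate(clients):
        for v in clients[i + 1:]:
            if padded_sets[u] & padded_sets[v]:
                edges[u].add(v)
                edges[v].add(u)
    return edges

# Example: three clients, 10 items, 2 fake entries each.
rng = random.Random(0)
true_sets = {"u1": {0, 1}, "u2": {1, 2}, "u3": {7}}
padded = {u: padded_item_set(items, num_items=10, num_fake=2, rng=rng)
          for u, items in true_sets.items()}
print(build_client_graph(padded))
```

Because the fake entries are indistinguishable from true ratings to a neighbor, an observer of the graph cannot tell which shared items a client actually rated; the sketch only illustrates this intuition, not the paper's formal guarantee.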
With the help of the privacy-aware structured client-level graph, we propose two novel collaborative training mechanisms for the setting without a server: a batch algorithm, DFedRec(b), and a stochastic one, DFedRec(s), where the former requires an anonymity mechanism while the latter does not. Both are equivalent to PMF trained on a centralized server and are thus lossless.
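For intuition, the following is a minimal sketch of the kind of local PMF-style SGD step a client might run under such a scheme, assuming each client keeps its own user factor and local copies of the item factors for items on its (padded) list, and later shares the updated item factors only with its graph neighbors. The function local_pmf_step and all hyperparameters are hypothetical; this is not DFedRec's exact algorithm.

```python
# Minimal sketch of a local PMF-style SGD pass, assuming client u holds its own
# user factor p_u and local copies of item factors q_i; updated q_i would then be
# shared only with neighbors in the client-level graph. Not the paper's algorithm.
import numpy as np

def local_pmf_step(p_u, q_items, ratings, lr=0.01, reg=0.05):
    """One SGD pass over this client's ratings; returns updated factors."""
    for i, r_ui in ratings.items():        # ratings: {item_id: observed rating}
        q_i = q_items[i]
        err = r_ui - p_u @ q_i             # prediction error for (u, i)
        grad_p = err * q_i - reg * p_u     # regularized gradient w.r.t. p_u
        grad_q = err * p_u - reg * q_i     # regularized gradient w.r.t. q_i
        p_u = p_u + lr * grad_p
        q_items[i] = q_i + lr * grad_q
    return p_u, q_items

# Example: one client with latent dimension 4 and two rated items.
rng = np.random.default_rng(0)
p_u = rng.normal(scale=0.1, size=4)
q_items = {3: rng.normal(scale=0.1, size=4), 8: rng.normal(scale=0.1, size=4)}
p_u, q_items = local_pmf_step(p_u, q_items, {3: 4.0, 8: 2.5})
```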
We then provide a formal analysis of the privacy guarantees of our methods and conduct extensive empirical studies on three public datasets with explicit feedback. The results show the effectiveness of our DFedRec: it is privacy-aware, communication-efficient, and lossless.
About the journal:
ACM Transactions on Intelligent Systems and Technology is a scholarly journal that publishes the highest quality papers on intelligent systems, applicable algorithms and technology with a multi-disciplinary perspective. An intelligent system is one that uses artificial intelligence (AI) techniques to offer important services (e.g., as a component of a larger system) to allow integrated systems to perceive, reason, learn, and act intelligently in the real world.
ACM TIST is published quarterly (six issues a year). Each issue has 8-11 regular papers, with around 20 published journal pages or 10,000 words per paper. Additional references, proofs, graphs or detailed experiment results can be submitted as a separate appendix, while excessively lengthy papers will be rejected automatically. Authors can include online-only appendices for additional content of their published papers and are encouraged to share their code and/or data with other readers.