Federated Prototype-based Contrastive Learning for Privacy-Preserving Cross-domain Recommendation
Li Wang, Quangui Zhang, Lei Sang, Qiang Wu, Min Xu
arXiv - CS - Information Retrieval, arXiv:2409.03294, 2024-09-05
Cross-domain recommendation (CDR) aims to improve recommendation accuracy in
sparse domains by transferring knowledge from data-rich domains. However,
existing CDR methods often assume the availability of user-item interaction
data across domains, overlooking user privacy concerns. Furthermore, these
methods suffer from performance degradation in scenarios with sparse
overlapping users, as they typically depend on a large number of fully shared
users for effective knowledge transfer. To address these challenges, we propose
a Federated Prototype-based Contrastive Learning (CL) method for
Privacy-Preserving CDR, named FedPCL-CDR. This approach utilizes
non-overlapping user information and prototypes to improve multi-domain
performance while protecting user privacy. FedPCL-CDR comprises two modules:
local domain (client) learning and global server aggregation. In the local
domain, FedPCL-CDR clusters all user data to learn representative prototypes,
effectively utilizing non-overlapping user information and addressing the
sparse overlapping user issue. It then transfers knowledge in a CL manner, employing both the local and global prototypes returned from the server. Meanwhile, the global server aggregates the representative prototypes uploaded by the local domains to learn the local and global prototypes. The combination
of prototypes and federated learning (FL) ensures that sensitive user data
remains decentralized, with only prototypes being shared across domains,
thereby protecting user privacy. Extensive experiments on four CDR tasks using
two real-world datasets demonstrate that FedPCL-CDR outperforms state-of-the-art baselines.
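
The abstract only sketches the method at a high level. As a rough, non-authoritative illustration of the two stages it names (client-side prototype learning by clustering all local users, and knowledge transfer by contrasting user representations against local and global prototypes, with the server aggregating the uploaded prototypes), the following minimal Python/PyTorch sketch may help. The k-means clustering, the InfoNCE-style loss with the nearest prototype as the positive, the concatenation-based server aggregation, and every function and parameter name are assumptions made for illustration, not details taken from the paper.

# Hypothetical sketch of the two FedPCL-CDR stages described in the abstract:
# (1) each client clusters ALL of its user embeddings (overlapping and
#     non-overlapping users alike) into representative prototypes;
# (2) knowledge is transferred with a prototype-based contrastive loss against
#     local and server-aggregated global prototypes.
# Names, hyperparameters, and the exact losses are illustrative assumptions.

import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans


def learn_local_prototypes(user_emb: torch.Tensor, num_prototypes: int = 20) -> torch.Tensor:
    """Cluster the client's user embeddings and return the centroids as prototypes."""
    kmeans = KMeans(n_clusters=num_prototypes, n_init=10).fit(
        user_emb.detach().cpu().numpy())
    return torch.tensor(kmeans.cluster_centers_, dtype=user_emb.dtype)


def prototype_contrastive_loss(user_emb: torch.Tensor,
                               prototypes: torch.Tensor,
                               temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: treat each user's nearest prototype as the positive
    and the remaining prototypes as negatives (a simplification of the paper's CL)."""
    u = F.normalize(user_emb, dim=-1)      # (B, d) user representations
    p = F.normalize(prototypes, dim=-1)    # (K, d) prototype representations
    logits = u @ p.t() / temperature       # (B, K) scaled cosine similarities
    targets = logits.argmax(dim=-1)        # nearest prototype acts as the positive
    return F.cross_entropy(logits, targets)


def server_aggregate(client_prototypes: list) -> torch.Tensor:
    """Server-side step: pool the prototypes uploaded by all clients into a
    global prototype set (concatenation here; the paper may aggregate differently)."""
    return torch.cat(client_prototypes, dim=0)


if __name__ == "__main__":
    torch.manual_seed(0)
    # Two toy clients (domains); their raw user embeddings never leave the client.
    client_embs = [torch.randn(200, 32), torch.randn(150, 32)]
    local_protos = [learn_local_prototypes(e) for e in client_embs]
    global_protos = server_aggregate(local_protos)   # only prototypes are shared

    # Client-side training signal: contrast against local AND global prototypes.
    loss = (prototype_contrastive_loss(client_embs[0], local_protos[0])
            + prototype_contrastive_loss(client_embs[0], global_protos))
    print(f"toy contrastive loss: {loss.item():.4f}")

In this sketch only the prototype tensors cross the client boundary, mirroring the abstract's privacy argument: raw user-item interaction data stays decentralized, and only prototypes are shared across domains.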