Deploy-able Privacy Preserving Collaborative ML

Nandish Chattopadhyay, Ritabrata Maiti, A. Chattopadhyay
DOI: 10.1109/ICDCS47774.2020.00184
Published in: 2020 IEEE 40th International Conference on Distributed Computing Systems (ICDCS), November 2020
Citations: 1

Abstract

In the data-driven world, emerging technologies like the Internet of Things (IoT) and other crowd-sourced data sources such as mobile devices generate a tremendous volume of decentralized data that must be analyzed to obtain the insights necessary for reliable decision making. Although the overall data is rich, its contributors are reluctant to share their own data due to serious concerns about protecting their privacy, while those interested in harvesting the data are constrained by the limited computational resources available to each participant. In this paper, we propose an end-to-end algorithm that combines collaborative learning in a decentralized fashion, using Federated Learning, with differential-privacy guarantees for each participating client, where clients are typically resource-constrained edge devices. We have developed the proposed infrastructure and analyzed its performance on a machine learning task using standard metrics. We observed that the collaborative learning framework actually improves prediction capability compared to a centrally trained model (by 1-2%) without sharing data among participants, while strong (ϵ, δ) privacy guarantees can be provided at some cost in performance (about 2-4%). Additionally, quantizing the model for deployment on edge devices does not degrade its capability, while enhancing overall system efficiency.
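The pipeline the abstract describes has three ingredients: federated averaging of client updates, (ϵ, δ)-differential privacy, and quantization for edge deployment. The following is a minimal sketch of how these pieces typically fit together, not the paper's actual implementation: the clipping norm, the Gaussian-mechanism noise calibration (assuming add/remove adjacency and a single release), and the symmetric int8 quantization scheme are all illustrative assumptions.

```python
import numpy as np

def clip_update(update, clip_norm):
    # Clip a client's update to bound its L2 sensitivity.
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / max(norm, 1e-12))

def dp_federated_average(client_updates, clip_norm=1.0, epsilon=1.0, delta=1e-5):
    # Average clipped client updates, then add Gaussian noise calibrated
    # for (epsilon, delta)-DP on this one aggregate release.
    n = len(client_updates)
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    avg = np.mean(clipped, axis=0)
    # Standard Gaussian-mechanism scale; sensitivity of the mean is clip_norm / n.
    sigma = (clip_norm / n) * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return avg + np.random.normal(0.0, sigma, size=avg.shape)

def quantize_int8(weights):
    # Symmetric post-training int8 quantization; returns (q, scale).
    scale = np.max(np.abs(weights)) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

# Toy round with 4 simulated clients.
rng = np.random.default_rng(0)
updates = [rng.normal(size=8) for _ in range(4)]
global_update = dp_federated_average(updates, clip_norm=1.0, epsilon=2.0, delta=1e-5)
q, scale = quantize_int8(global_update)
dequantized = q.astype(np.float32) * scale  # approximate reconstruction on the device
```

In a real deployment each client would compute its update by training locally on private data, only these (clipped, noised) updates would travel to the aggregator, and the quantized model would be shipped back to the edge devices, matching the abstract's claim that raw data never leaves the participants.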