Fed-NILM: A federated learning-based non-intrusive load monitoring method for privacy-protection

Haijin Wang, Caomingzhe Si, Guolong Liu, Junhua Zhao, Fushuan Wen, Yusheng Xue
{"title":"Fed-NILM:一种用于隐私保护的基于联邦学习的非侵入式负载监控方法","authors":"Haijin Wang,&nbsp;Caomingzhe Si,&nbsp;Guolong Liu,&nbsp;Junhua Zhao,&nbsp;Fushuan Wen,&nbsp;Yusheng Xue","doi":"10.1049/enc2.12055","DOIUrl":null,"url":null,"abstract":"<p>Non-intrusive load monitoring (NILM) is essential for understanding consumer power consumption patterns and may have wide applications such as in carbon emission reduction and energy conservation. Determining NILM models requires massive load data containing different types of appliances. However, inadequate load data and the risk of power consumer privacy breaches may be encountered by local data owners when determining the NILM model. To address these problems, a novel NILM method based on federated learning (FL) called Fed-NILM is proposed. In Fed-NILM, instead of local load data, local model parameters are shared among multiple data owners. The global NILM model is obtained by averaging the parameters with the appropriate weights. Experiments based on two measured load datasets are performed to explore the generalization capability of Fed-NILM. In addition, a comparison of Fed-NILM with locally trained NILM models and the centrally trained NILM model is conducted. Experimental results show that the Fed-NILM exhibits superior performance in terms of scalability and convergence. Fed-NILM out performs locally trained NILM models operated by local data owners and approaches the centrally trained NILM model, which is trained on the entire load dataset without privacy protection. The proposed Fed-NILM significantly improves the co-modelling capabilities of local data owners while protecting the privacy of power consumers.</p>","PeriodicalId":100467,"journal":{"name":"Energy Conversion and Economics","volume":"3 2","pages":"51-60"},"PeriodicalIF":0.0000,"publicationDate":"2022-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/enc2.12055","citationCount":"11","resultStr":"{\"title\":\"Fed-NILM: A federated learning-based non-intrusive load monitoring method for privacy-protection\",\"authors\":\"Haijin Wang,&nbsp;Caomingzhe Si,&nbsp;Guolong Liu,&nbsp;Junhua Zhao,&nbsp;Fushuan Wen,&nbsp;Yusheng Xue\",\"doi\":\"10.1049/enc2.12055\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Non-intrusive load monitoring (NILM) is essential for understanding consumer power consumption patterns and may have wide applications such as in carbon emission reduction and energy conservation. Determining NILM models requires massive load data containing different types of appliances. However, inadequate load data and the risk of power consumer privacy breaches may be encountered by local data owners when determining the NILM model. To address these problems, a novel NILM method based on federated learning (FL) called Fed-NILM is proposed. In Fed-NILM, instead of local load data, local model parameters are shared among multiple data owners. The global NILM model is obtained by averaging the parameters with the appropriate weights. Experiments based on two measured load datasets are performed to explore the generalization capability of Fed-NILM. In addition, a comparison of Fed-NILM with locally trained NILM models and the centrally trained NILM model is conducted. Experimental results show that the Fed-NILM exhibits superior performance in terms of scalability and convergence. 
Fed-NILM out performs locally trained NILM models operated by local data owners and approaches the centrally trained NILM model, which is trained on the entire load dataset without privacy protection. The proposed Fed-NILM significantly improves the co-modelling capabilities of local data owners while protecting the privacy of power consumers.</p>\",\"PeriodicalId\":100467,\"journal\":{\"name\":\"Energy Conversion and Economics\",\"volume\":\"3 2\",\"pages\":\"51-60\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-04-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/enc2.12055\",\"citationCount\":\"11\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Energy Conversion and Economics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1049/enc2.12055\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Energy Conversion and Economics","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1049/enc2.12055","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 11

Abstract

Non-intrusive load monitoring (NILM) is essential for understanding consumer power consumption patterns and may have wide applications such as in carbon emission reduction and energy conservation. Determining NILM models requires massive load data containing different types of appliances. However, local data owners may face inadequate load data and the risk of power consumer privacy breaches when determining the NILM model. To address these problems, a novel NILM method based on federated learning (FL), called Fed-NILM, is proposed. In Fed-NILM, local model parameters, rather than local load data, are shared among multiple data owners. The global NILM model is obtained by averaging the parameters with appropriate weights. Experiments based on two measured load datasets are performed to explore the generalization capability of Fed-NILM. In addition, Fed-NILM is compared with locally trained NILM models and a centrally trained NILM model. The experimental results show that Fed-NILM exhibits superior performance in terms of scalability and convergence. Fed-NILM outperforms the locally trained NILM models operated by local data owners and approaches the centrally trained NILM model, which is trained on the entire load dataset without privacy protection. The proposed Fed-NILM significantly improves the co-modelling capability of local data owners while protecting the privacy of power consumers.
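The abstract describes the aggregation step only at a high level: data owners share model parameters instead of raw load data, and the global model is a weighted average of those parameters. The minimal Python sketch below illustrates that kind of weighted averaging; it is not the authors' implementation. Weighting each owner by its number of load samples is an assumption (the paper says only "appropriate weights"), and all function and variable names are illustrative.

```python
# Minimal sketch of weighted parameter averaging for a federated NILM setup.
# Assumption: each data owner trains a local model and shares only its
# parameters; the server averages them, weighted here by local sample count.
import numpy as np

def aggregate_parameters(local_params, local_weights):
    """Return the weighted average of per-owner parameter lists.

    local_params : list (one entry per owner) of lists of np.ndarray,
                   holding that owner's model parameters layer by layer.
    local_weights: list of floats, e.g. each owner's number of load samples.
    """
    weights = np.asarray(local_weights, dtype=float)
    weights /= weights.sum()  # normalise so the weights sum to one
    global_params = []
    for layer_idx in range(len(local_params[0])):
        # Weighted sum of the same layer across all owners
        layer_stack = [w * owner[layer_idx]
                       for w, owner in zip(weights, local_params)]
        global_params.append(np.sum(layer_stack, axis=0))
    return global_params

# Example: two owners with a tiny two-layer model, weighted by sample counts.
owner_a = [np.ones((2, 2)), np.zeros(2)]
owner_b = [np.full((2, 2), 3.0), np.ones(2)]
global_model = aggregate_parameters([owner_a, owner_b], local_weights=[100, 300])
```

In an FL round, the aggregated parameters would be sent back to each owner as the starting point for the next round of local training, so raw load data never leaves the owner's premises.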

