A Lightweight Convolutional Network Combined with Channel Attention for Non-intrusive Load Monitoring

Zhan Liu, Gan Zhou, Yanjun Feng, Jing Zhang, Ying Zeng, Long Jin

2023 6th International Conference on Energy, Electrical and Power Engineering (CEEPE), 12 May 2023. DOI: 10.1109/CEEPE58418.2023.10167082
Non-intrusive load monitoring (NILM) is the basis of end-side informatization applications in smart grids. The fine-grained, per-appliance power consumption recovered by NILM algorithms also plays an important role in adjusting the power consumption structure. Deep neural networks have become the focus of research in non-intrusive load identification, but most network models concentrate only on improving identification accuracy while ignoring the model-size constraints imposed by hardware monitoring devices in actual deployment. In this paper, we propose LACNet, a lightweight convolutional neural network combined with a channel attention mechanism. Its serialized multi-scale dilated convolution structure enlarges the receptive field while reducing the number of parameters, thereby compressing the model size, and a channel attention mechanism reweights the extracted features to improve identification accuracy. Finally, we conducted experimental verification on the public UK-DALE dataset. The results show that LACNet outperforms several existing load identification networks on estimation accuracy (EA) and other evaluation metrics, while also greatly reducing the number of model parameters.
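The two building blocks named in the abstract can be sketched in a few lines. The paper's code and exact configuration are not given here, so the kernel, dilation schedule `(1, 2, 4, 8)`, window length, and the squeeze-and-excitation-style gate below are illustrative assumptions, not LACNet's actual design:

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """'Same'-padded 1-D dilated convolution of signal x with kernel w.

    Dilation inserts gaps between kernel taps, so the receptive span
    grows as (len(w) - 1) * dilation without adding any parameters.
    """
    k = len(w)
    span = (k - 1) * dilation
    pad = span // 2
    xp = np.pad(x, (pad, span - pad))          # zero-pad to keep length
    return np.array([
        sum(w[j] * xp[i + j * dilation] for j in range(k))
        for i in range(len(x))
    ])

def channel_attention(features):
    """Squeeze-and-excitation-style channel reweighting (an assumption
    about the attention form; the paper only names 'channel attention').

    features: (channels, time). Global average pooling squeezes each
    channel to a scalar; a sigmoid turns it into a gate in (0, 1).
    """
    squeeze = features.mean(axis=1)            # (channels,)
    gate = 1.0 / (1.0 + np.exp(-squeeze))      # sigmoid excitation
    return features * gate[:, None]            # scale each channel

# Serialized multi-scale dilations: stacking the same 3-tap kernel at
# growing dilation rates widens coverage of the aggregate-power window.
rng = np.random.default_rng(0)
x = rng.normal(size=256)                       # one aggregate-power window
w = np.array([0.25, 0.5, 0.25])
feats = np.stack([dilated_conv1d(x, w, d) for d in (1, 2, 4, 8)])
out = channel_attention(feats)
print(out.shape)                               # (4, 256)
```

The point of the combination is visible even in this toy version: the four dilated branches cost only three shared weights each, and the attention gate suppresses or emphasizes whole branches rather than adding per-timestep parameters.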