FLACON: A Deep Federated Transfer Learning-Enabled Transient Stability Assessment During Symmetrical and Asymmetrical Grid Faults

IEEE Open Journal of Industry Applications (Q1, Engineering, Electrical & Electronic; IF 7.9)
Publication date: 2024-07-10 | Volume 5, pp. 253-266 | DOI: 10.1109/OJIA.2024.3426281
Authors: Mohamed Massaoudi, Haitham Abu-Rub, Ali Ghrayeb
Full text: https://ieeexplore.ieee.org/document/10592792/

Abstract

Transient stability assessment (TSA) is critical to the reliable operation of a power system against severe fault conditions. In practice, TSA based on deep learning is preferable for its high accuracy but often overlooks challenges in maintaining data privacy while coping with network topology changes. This article proposes an innovative focal loss-based multihead attention convolutional network (FLACON) for accurate post-disturbance TSA under both symmetrical and asymmetrical smart grid faults. The proposed approach effectively incorporates cross-domain deep federated transfer learning (FTL) to leverage local operating data for TSA in a decentralized fashion. By introducing convolutional layers alongside multi-head attention mechanisms, the FLACON framework significantly improves learning efficiency across geographically distributed datasets. To address the challenge of class imbalance, the model integrates a balance factor-enhanced focal loss function. The FTL architecture enables decentralized model training across various clients, thus preserving data privacy and reducing the burden of communication overhead. To avoid the constant adjustment of hyperparameters, the FLACON employs an inductive transfer learning approach for hyperparameter tuning of the pre-trained model, markedly decreasing training time. Extensive experiments on datasets from the IEEE 39-bus system and the IEEE 68-bus system demonstrate FLACON's exceptional accuracy of 98.98% compared to some competitive alternatives.
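The abstract only outlines the architecture, so the following is an illustrative sketch, not the paper's exact network, of how convolutional layers can be combined with a multi-head attention mechanism for post-disturbance stability classification. The class name FlaconLikeNet, the layer sizes, and the input layout (bus-level features over post-fault time steps) are assumptions made for the example; only the overall convolution-plus-attention pattern comes from the abstract.

```python
import torch
import torch.nn as nn

class FlaconLikeNet(nn.Module):
    """Illustrative CNN + multi-head-attention classifier over post-fault
    measurement windows (channels = per-bus features, length = time steps).
    Layer sizes are placeholders, not values from the paper."""

    def __init__(self, in_channels, d_model=64, n_heads=4):
        super().__init__()
        # 1-D convolutions extract local temporal patterns from the fault trajectory
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Multi-head self-attention relates time steps across the whole window
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)   # single logit: stable vs. unstable

    def forward(self, x):                    # x: (batch, in_channels, time)
        h = self.conv(x).transpose(1, 2)     # (batch, time, d_model)
        h, _ = self.attn(h, h, h)            # self-attention over time steps
        return self.head(h.mean(dim=1))      # pooled logit, shape (batch, 1)
```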
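The paper addresses class imbalance between stable and unstable samples with a balance factor-enhanced focal loss. A minimal sketch of a standard binary focal loss with an explicit class-balance factor is shown below; the exact form of the paper's balance factor and the values of alpha and gamma are not given in the abstract, so they are placeholders here.

```python
import torch
import torch.nn.functional as F

def balanced_focal_loss(logits, targets, alpha=0.75, gamma=2.0):
    """Binary focal loss with a class-balance factor (illustrative values).

    logits:  raw model outputs, shape (batch,)
    targets: 0/1 labels (e.g. 1 = unstable), shape (batch,)
    alpha:   balance factor weighting the minority (unstable) class
    gamma:   focusing parameter that down-weights easy examples
    """
    targets = targets.float()
    # Per-sample binary cross-entropy without reduction
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    # p_t is the model's probability of the true class
    p_t = torch.exp(-bce)
    # Class-balance factor: alpha for the positive class, (1 - alpha) otherwise
    alpha_t = alpha * targets + (1.0 - alpha) * (1.0 - targets)
    return (alpha_t * (1.0 - p_t) ** gamma * bce).mean()
```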
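The federated transfer learning setup trains the model across clients without exchanging raw measurements. The abstract does not state the aggregation rule, so the sketch below assumes a FedAvg-style weighted average of client weights; the function name federated_round, the client data loaders, and the optimizer settings are illustrative, and it reuses the model and loss sketched above.

```python
import copy
import torch

def federated_round(global_model, clients, local_epochs=1, lr=1e-3):
    """One FedAvg-style round: each client fits the shared model on its
    private TSA data; only the resulting weights are averaged centrally."""
    client_states, client_sizes = [], []
    for loader in clients:                    # each loader holds one utility's local data
        local = copy.deepcopy(global_model)
        opt = torch.optim.Adam(local.parameters(), lr=lr)
        local.train()
        for _ in range(local_epochs):
            for x, y in loader:
                opt.zero_grad()
                loss = balanced_focal_loss(local(x).squeeze(-1), y)
                loss.backward()
                opt.step()
        client_states.append(local.state_dict())
        client_sizes.append(len(loader.dataset))
    # Weighted average of client weights; raw measurements never leave the client
    total = sum(client_sizes)
    avg_state = {
        k: sum(s[k].float() * (n / total) for s, n in zip(client_states, client_sizes))
        for k in client_states[0]
    }
    global_model.load_state_dict(avg_state)
    return global_model
```

In this sketch the communication cost per round is one model upload and download per client, which is the usual argument for reduced overhead relative to centralizing the raw operating data.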