{"title":"GNNs' Generalization Improvement for Large-Scale Power System Analysis Based on Physics-Informed Self-Supervised Pre-Training","authors":"Yuhong Zhu;Yongzhi Zhou;Wei Wei;Peng Li;Wenqi Huang","doi":"10.1109/TPWRS.2025.3544312","DOIUrl":null,"url":null,"abstract":"Efficient and informative representation of system topologies is critical in AI-driven power system analysis (PSA). Despite a major breakthrough, recent approaches employing Graph Neural Networks (GNNs) face significant challenges in large-scale PSA, including high computational demands for sufficient labeled data and poor generalization to unseen disturbed topologies. To tackle these issues, we propose a self-supervised strategy for pre-training GNNs that enhances their expressiveness at both the individual node feature level and the whole graph structure. Integrating physics-informed techniques, our strategy allows GNNs to internalize fundamental principles applicable to multiple downstream tasks. We demonstrate that our method enables the efficient training of GNNs on extensive topology datasets without supervision, effectively addressing the noted challenges. 
By pre-training GNNs with 145 million parameters on 20 million unlabeled topologies and subsequently fine-tuning them, we observe a significant performance improvement, averaging over 13%, compared to existing state-of-the-art (SOTA) methods across four challenging tasks.","PeriodicalId":13373,"journal":{"name":"IEEE Transactions on Power Systems","volume":"40 5","pages":"4145-4157"},"PeriodicalIF":7.2000,"publicationDate":"2025-02-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Power Systems","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10901974/","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
Efficient and informative representation of system topologies is critical in AI-driven power system analysis (PSA). Despite a major breakthrough, recent approaches employing Graph Neural Networks (GNNs) face significant challenges in large-scale PSA, including high computational demands for sufficient labeled data and poor generalization to unseen disturbed topologies. To tackle these issues, we propose a self-supervised strategy for pre-training GNNs that enhances their expressiveness at both the individual node feature level and the whole graph structure. Integrating physics-informed techniques, our strategy allows GNNs to internalize fundamental principles applicable to multiple downstream tasks. We demonstrate that our method enables the efficient training of GNNs on extensive topology datasets without supervision, effectively addressing the noted challenges. By pre-training GNNs with 145 million parameters on 20 million unlabeled topologies and subsequently fine-tuning them, we observe a significant performance improvement, averaging over 13%, compared to existing state-of-the-art (SOTA) methods across four challenging tasks.
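The abstract describes pre-training GNNs without labels by letting physical laws themselves supply the training signal. A natural instance of such a physics-informed self-supervised objective is the AC power-flow mismatch: score a model's predicted bus voltages by how well they satisfy the nodal power-balance equations, with no labeled solutions required. The sketch below is a hypothetical illustration of that idea, not the paper's actual loss; the function name, network data, and choice of residual are assumptions for illustration.

```python
import numpy as np

def pf_residual_loss(V, Y, S):
    """Hypothetical physics-informed self-supervised loss.

    Scores complex bus voltages V against the AC power-flow equations
        S_i = V_i * conj(sum_k Y_ik V_k)
    for a bus admittance matrix Y and specified injections S.
    The physics, not labels, provides the supervision signal.
    """
    S_calc = V * np.conj(Y @ V)          # power implied by the voltages
    return float(np.mean(np.abs(S_calc - S) ** 2))  # squared mismatch

# Toy 2-bus network (illustrative values): one line of admittance y.
y = 1.0 - 5.0j
Y = np.array([[y, -y],
              [-y, y]])
V = np.array([1.0 + 0.0j, 0.95 * np.exp(-0.05j)])

# Injections consistent with V give zero loss; perturbed ones do not.
S_exact = V * np.conj(Y @ V)
print(pf_residual_loss(V, Y, S_exact))        # ~0: physics satisfied
print(pf_residual_loss(V, Y, S_exact + 0.1))  # > 0: mismatch penalized
```

In a pre-training loop, V would come from the GNN's forward pass over the topology graph, and this residual (or a similar physics term) would be minimized over large sets of unlabeled topologies.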
Journal Description:
The scope of IEEE Transactions on Power Systems covers the education, analysis, operation, planning, and economics of electric generation, transmission, and distribution systems for general industrial, commercial, public, and domestic consumption, including the interaction with multi-energy carriers. The focus of this Transactions is the power system from a systems viewpoint rather than on individual components of the system. Its scope comprises five key areas, each containing several technical topics: (1) Power Engineering Education, (2) Power System Analysis, Computing, and Economics, (3) Power System Dynamic Performance, (4) Power System Operations, and (5) Power System Planning and Implementation.