{"title":"Label as Equilibrium: A performance booster for Graph Neural Networks on node classification","authors":"Yi Luo, Guangchun Luo, Guiduo Duan, Aiguo Chen","doi":"10.1016/j.neunet.2025.107284","DOIUrl":null,"url":null,"abstract":"<div><div>Graph Neural Network (GNN) is effective in graph mining and has become a dominant solution to the node classification task. Recently, a series of label reuse approaches emerged to boost the node classification performance of GNN. They repeatedly input the predicted node class labels into the underlying GNN to update the predictions. However, there are two issues in label reuse that prevent it from performing better. First, re-inputting predictions that are close to the training labels makes the GNN over-fitting, resulting in generalization loss and performance degradation. Second, the repeated iterations consume unaffordable memory for gradient descent, leading to compromised optimization and suboptimal results. To address these issues, we propose an advanced label reuse approach termed Label as Equilibrium (LaE). It has <strong>(1)</strong> an improved masking strategy with supervision concealment that resolves prediction over-fitting and <strong>(2)</strong> an infinite number of iterations which is optimizable within constant memory consumption. Excessive node classification experiments demonstrate the superiority of LaE. It significantly increases the accuracy scores of prevailing GNNs by 2.31% on average and outperforms previous label reuse approaches on eight real-world datasets by 1.60% on average. Considering the wide application of label reuse, many state-of-the-art GNNs can benefit from our techniques. Code to reproduce all our experiments is released at <span><span>https://github.com/cf020031308/LaE</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"186 ","pages":"Article 107284"},"PeriodicalIF":6.0000,"publicationDate":"2025-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025001637","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Graph Neural Networks (GNNs) are effective at graph mining and have become the dominant solution to the node classification task. Recently, a series of label reuse approaches has emerged to boost the node classification performance of GNNs. These approaches repeatedly feed the predicted node class labels back into the underlying GNN to update the predictions. However, two issues prevent label reuse from performing better. First, re-inputting predictions that are close to the training labels causes the GNN to overfit, resulting in a loss of generalization and degraded performance. Second, the repeated iterations consume prohibitive amounts of memory during gradient descent, leading to compromised optimization and suboptimal results. To address these issues, we propose an advanced label reuse approach termed Label as Equilibrium (LaE). It features (1) an improved masking strategy with supervision concealment that resolves prediction overfitting and (2) an effectively infinite number of iterations that can be optimized within constant memory. Extensive node classification experiments demonstrate the superiority of LaE: it significantly increases the accuracy of prevailing GNNs by 2.31% on average and outperforms previous label reuse approaches on eight real-world datasets by 1.60% on average. Given the wide application of label reuse, many state-of-the-art GNNs can benefit from our techniques. Code to reproduce all our experiments is released at https://github.com/cf020031308/LaE.
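The constant-memory, unbounded-iteration claim suggests a fixed-point (equilibrium) formulation of label reuse. The PyTorch sketch below illustrates that general idea only; it is not the authors' implementation, and the solver loop, the single differentiable step at the end, and all names (label_equilibrium_forward, gnn, adj, train_mask) are our assumptions.

import torch
import torch.nn.functional as F

def label_equilibrium_forward(gnn, x, adj, y_train, train_mask,
                              num_classes, max_iter=50, tol=1e-4):
    # Soft labels start at zero; known training labels are injected.
    # (The paper's supervision-concealment masking is omitted here.)
    n = x.size(0)
    y_hat = torch.zeros(n, num_classes, device=x.device)
    y_hat[train_mask] = F.one_hot(y_train, num_classes).float()

    # Fixed-point iterations: run to (approximate) equilibrium without
    # building a computation graph, so memory does not grow with the
    # number of label-reuse iterations.
    with torch.no_grad():
        for _ in range(max_iter):
            inp = torch.cat([x, y_hat], dim=1)  # labels reused as input features
            y_next = gnn(inp, adj).softmax(dim=1)
            converged = (y_next - y_hat).abs().max() < tol
            y_hat = y_next
            if converged:
                break

    # One differentiable pass at the equilibrium point: gradients flow
    # through a single GNN application, keeping memory constant.
    return gnn(torch.cat([x, y_hat], dim=1), adj)

In practice, an implicit-differentiation or phantom-gradient backward pass would likely replace this one-step approximation; the released code at https://github.com/cf020031308/LaE documents the authors' actual method.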
About the Journal
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically inspired artificial intelligence.