{"title":"Automated Loss function Search for Class-imbalanced Node Classification","authors":"Xinyu Guo, Kai Wu, Xiaoyu Zhang, Jing Liu","doi":"arxiv-2405.14133","DOIUrl":null,"url":null,"abstract":"Class-imbalanced node classification tasks are prevalent in real-world\nscenarios. Due to the uneven distribution of nodes across different classes,\nlearning high-quality node representations remains a challenging endeavor. The\nengineering of loss functions has shown promising potential in addressing this\nissue. It involves the meticulous design of loss functions, utilizing\ninformation about the quantities of nodes in different categories and the\nnetwork's topology to learn unbiased node representations. However, the design\nof these loss functions heavily relies on human expert knowledge and exhibits\nlimited adaptability to specific target tasks. In this paper, we introduce a\nhigh-performance, flexible, and generalizable automated loss function search\nframework to tackle this challenge. Across 15 combinations of graph neural\nnetworks and datasets, our framework achieves a significant improvement in\nperformance compared to state-of-the-art methods. Additionally, we observe that\nhomophily in graph-structured data significantly contributes to the\ntransferability of the proposed framework.","PeriodicalId":501033,"journal":{"name":"arXiv - CS - Symbolic Computation","volume":"48 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Symbolic Computation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2405.14133","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Class-imbalanced node classification tasks are prevalent in real-world scenarios. Due to the uneven distribution of nodes across different classes, learning high-quality node representations remains a challenging endeavor. The engineering of loss functions has shown promising potential in addressing this issue. It involves the meticulous design of loss functions, utilizing information about the quantities of nodes in different categories and the network's topology to learn unbiased node representations. However, the design of these loss functions heavily relies on human expert knowledge and exhibits limited adaptability to specific target tasks. In this paper, we introduce a high-performance, flexible, and generalizable automated loss function search framework to tackle this challenge. Across 15 combinations of graph neural networks and datasets, our framework achieves a significant improvement in performance compared to state-of-the-art methods. Additionally, we observe that homophily in graph-structured data significantly contributes to the transferability of the proposed framework.
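The abstract mentions using the quantities of nodes in different classes to counteract imbalance. A standard hand-designed baseline of this kind (not the paper's searched loss, which is discovered automatically) is inverse-frequency class weighting, sketched below; the function name and the weighting formula w_c = N / (K * n_c) are illustrative assumptions, not taken from the paper:

```python
from collections import Counter

def class_balanced_weights(labels):
    """Inverse-frequency class weights: w_c = N / (K * n_c),
    where N is the number of nodes, K the number of classes,
    and n_c the count of class c. Rare classes receive larger
    weights, so a weighted cross-entropy loss penalizes errors
    on minority classes more heavily."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * m) for c, m in counts.items()}

# Example: 6 nodes of class 0, 2 of class 1, 1 of class 2.
weights = class_balanced_weights([0, 0, 0, 0, 0, 0, 1, 1, 2])
# Majority class 0 gets weight 9/(3*6) = 0.5; rare class 2 gets 9/(3*1) = 3.0.
```

Automated loss function search, as described in the abstract, would explore a space of such re-weighting and re-shaping choices rather than fixing one by hand.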