{"title":"A Linearly Convergent Optimization Framework for Learning Graphs From Smooth Signals","authors":"Xiaolu Wang;Chaorui Yao;Anthony Man-Cho So","doi":"10.1109/TSIPN.2023.3295770","DOIUrl":null,"url":null,"abstract":"Learning graph structures from a collection of smooth graph signals is a fundamental problem in data analysis and has attracted much interest in recent years. Although various optimization formulations of the problem have been proposed in the literature, existing methods for solving them either are not practically efficient or lack strong convergence guarantees. In this article, we consider a unified graph learning formulation that captures a wide range of static and time-varying graph learning models and develop a first-order method for solving it. By showing that the set of Karush-Kuhn-Tucker points of the formulation possesses a so-called \n<italic>error bound property</i>\n, we establish the linear convergence of our proposed method. Moreover, through extensive numerical experiments on both synthetic and real data, we show that our method exhibits sharp linear convergence and can be substantially faster than a host of other existing methods. To the best of our knowledge, our work is the first to develop a first-order method that not only is practically efficient but also enjoys a linear convergence guarantee when applied to a large class of graph learning models.","PeriodicalId":56268,"journal":{"name":"IEEE Transactions on Signal and Information Processing over Networks","volume":"9 ","pages":"490-504"},"PeriodicalIF":3.0000,"publicationDate":"2023-08-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Signal and Information Processing over Networks","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10214315/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
Learning graph structures from a collection of smooth graph signals is a fundamental problem in data analysis and has attracted much interest in recent years. Although various optimization formulations of the problem have been proposed in the literature, existing methods for solving them either are not practically efficient or lack strong convergence guarantees. In this article, we consider a unified graph learning formulation that captures a wide range of static and time-varying graph learning models and develop a first-order method for solving it. By showing that the set of Karush-Kuhn-Tucker points of the formulation possesses a so-called error bound property, we establish the linear convergence of our proposed method. Moreover, through extensive numerical experiments on both synthetic and real data, we show that our method exhibits sharp linear convergence and can be substantially faster than a host of other existing methods. To the best of our knowledge, our work is the first to develop a first-order method that not only is practically efficient but also enjoys a linear convergence guarantee when applied to a large class of graph learning models.
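The abstract does not spell out the unified formulation, but a representative member of the smooth-signal graph learning family it refers to is the log-degree model in the style of Kalofolias (2016): minimize the weighted sum of squared signal differences across edges, penalize zero node degrees with a logarithmic barrier, and regularize the edge weights. The NumPy sketch below applies plain projected gradient descent to that generic model; the function name, the parameters alpha, beta, and step, and the objective itself are illustrative assumptions, not the paper's unified formulation or its linearly convergent first-order method.

import numpy as np

def learn_graph_from_smooth_signals(X, alpha=1.0, beta=0.5,
                                    step=1e-3, n_iter=5000, eps=1e-8):
    # X: (n, p) matrix whose i-th row is the signal observed at node i.
    # Returns a symmetric nonnegative adjacency matrix W with zero diagonal.
    n = X.shape[0]

    # Squared pairwise distances Z_ij = ||x_i - x_j||^2 (smoothness cost per edge).
    gram = X @ X.T
    sq = np.diag(gram)
    Z = np.maximum(sq[:, None] + sq[None, :] - 2.0 * gram, 0.0)
    np.fill_diagonal(Z, 0.0)

    # Start from a strictly positive feasible point (complete graph).
    W = np.ones((n, n)) - np.eye(n)

    for _ in range(n_iter):
        # Node degrees, floored so the log-barrier gradient stays finite.
        d = np.maximum(W.sum(axis=1), eps)
        # Gradient of  <W, Z> - alpha * sum_i log(d_i) + (beta / 2) * ||W||_F^2.
        grad = Z - alpha / d[:, None] + beta * W
        W = W - step * grad
        # Euclidean projection onto {W : W = W^T, W >= 0, diag(W) = 0}.
        W = np.maximum(0.5 * (W + W.T), 0.0)
        np.fill_diagonal(W, 0.0)

    return W

# Example usage on synthetic data:
# X = np.random.randn(20, 100)
# W = learn_graph_from_smooth_signals(X)

Such a basic projected-gradient scheme converges only sublinearly in general; the point of the paper is precisely that, for the whole class of formulations of this kind, a suitably designed first-order method attains linear convergence via the error bound property of the KKT set.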
Journal description:
The IEEE Transactions on Signal and Information Processing over Networks publishes high-quality papers that extend the classical notions of processing of signals defined over vector spaces (e.g. time and space) to processing of signals and information (data) defined over networks, potentially dynamically varying. In signal processing over networks, the topology of the network may define structural relationships in the data, or may constrain processing of the data. Topics include distributed algorithms for filtering, detection, estimation, adaptation and learning, model selection, data fusion, and diffusion or evolution of information over such networks, and applications of distributed signal processing.