Learning Hypergraphs Tensor Representations From Data via t-HGSP

IF: 3.0 | CAS Tier 3 (Computer Science) | JCR Q2 (Engineering, Electrical & Electronic) | IEEE Transactions on Signal and Information Processing over Networks | Pub Date: 2023-12-20 | DOI: 10.1109/TSIPN.2023.3345142
Karelia Pena-Pena;Lucas Taipe;Fuli Wang;Daniel L. Lau;Gonzalo R. Arce
{"title":"通过 t-HGSP 从数据中学习超图张量表示","authors":"Karelia Pena-Pena;Lucas Taipe;Fuli Wang;Daniel L. Lau;Gonzalo R. Arce","doi":"10.1109/TSIPN.2023.3345142","DOIUrl":null,"url":null,"abstract":"Representation learning considering high-order relationships in data has recently shown to be advantageous in many applications. The construction of a meaningful hypergraph plays a crucial role in the success of hypergraph-based representation learning methods, which is particularly useful in hypergraph neural networks and hypergraph signal processing. However, a meaningful hypergraph may only be available in specific cases. This paper addresses the challenge of learning the underlying hypergraph topology from the data itself. As in graph signal processing applications, we consider the case in which the data possesses certain regularity or smoothness on the hypergraph. To this end, our method builds on the novel tensor-based hypergraph signal processing framework (t-HGSP) that has recently emerged as a powerful tool for preserving the intrinsic high-order structure of data on hypergraphs. Given the hypergraph spectrum and frequency coefficient definitions within the t-HGSP framework, we propose a method to learn the hypergraph Laplacian from data by minimizing the total variation on the hypergraph (TVL-HGSP). Additionally, we introduce an alternative approach (PDL-HGSP) that improves the connectivity of the learned hypergraph without compromising sparsity and use primal-dual-based algorithms to reduce the computational complexity. Finally, we combine the proposed learning algorithms with novel tensor-based hypergraph convolutional neural networks to propose hypergraph learning-convolutional neural networks (t-HyperGLNN).","PeriodicalId":56268,"journal":{"name":"IEEE Transactions on Signal and Information Processing over Networks","volume":"10 ","pages":"17-31"},"PeriodicalIF":3.0000,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Learning Hypergraphs Tensor Representations From Data via t-HGSP\",\"authors\":\"Karelia Pena-Pena;Lucas Taipe;Fuli Wang;Daniel L. Lau;Gonzalo R. Arce\",\"doi\":\"10.1109/TSIPN.2023.3345142\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Representation learning considering high-order relationships in data has recently shown to be advantageous in many applications. The construction of a meaningful hypergraph plays a crucial role in the success of hypergraph-based representation learning methods, which is particularly useful in hypergraph neural networks and hypergraph signal processing. However, a meaningful hypergraph may only be available in specific cases. This paper addresses the challenge of learning the underlying hypergraph topology from the data itself. As in graph signal processing applications, we consider the case in which the data possesses certain regularity or smoothness on the hypergraph. To this end, our method builds on the novel tensor-based hypergraph signal processing framework (t-HGSP) that has recently emerged as a powerful tool for preserving the intrinsic high-order structure of data on hypergraphs. Given the hypergraph spectrum and frequency coefficient definitions within the t-HGSP framework, we propose a method to learn the hypergraph Laplacian from data by minimizing the total variation on the hypergraph (TVL-HGSP). 
Additionally, we introduce an alternative approach (PDL-HGSP) that improves the connectivity of the learned hypergraph without compromising sparsity and use primal-dual-based algorithms to reduce the computational complexity. Finally, we combine the proposed learning algorithms with novel tensor-based hypergraph convolutional neural networks to propose hypergraph learning-convolutional neural networks (t-HyperGLNN).\",\"PeriodicalId\":56268,\"journal\":{\"name\":\"IEEE Transactions on Signal and Information Processing over Networks\",\"volume\":\"10 \",\"pages\":\"17-31\"},\"PeriodicalIF\":3.0000,\"publicationDate\":\"2023-12-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Signal and Information Processing over Networks\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10366871/\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Signal and Information Processing over Networks","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10366871/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Representation learning considering high-order relationships in data has recently shown to be advantageous in many applications. The construction of a meaningful hypergraph plays a crucial role in the success of hypergraph-based representation learning methods, which is particularly useful in hypergraph neural networks and hypergraph signal processing. However, a meaningful hypergraph may only be available in specific cases. This paper addresses the challenge of learning the underlying hypergraph topology from the data itself. As in graph signal processing applications, we consider the case in which the data possesses certain regularity or smoothness on the hypergraph. To this end, our method builds on the novel tensor-based hypergraph signal processing framework (t-HGSP) that has recently emerged as a powerful tool for preserving the intrinsic high-order structure of data on hypergraphs. Given the hypergraph spectrum and frequency coefficient definitions within the t-HGSP framework, we propose a method to learn the hypergraph Laplacian from data by minimizing the total variation on the hypergraph (TVL-HGSP). Additionally, we introduce an alternative approach (PDL-HGSP) that improves the connectivity of the learned hypergraph without compromising sparsity and use primal-dual-based algorithms to reduce the computational complexity. Finally, we combine the proposed learning algorithms with novel tensor-based hypergraph convolutional neural networks to propose hypergraph learning-convolutional neural networks (t-HyperGLNN).
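The core idea behind TVL-HGSP, learning a Laplacian under a smoothness (total-variation) prior, has a well-known pairwise-graph analogue. The sketch below is a minimal NumPy illustration of that analogue only: it learns non-negative edge weights by minimizing the quadratic total variation sum_ij W_ij ||x_i - x_j||^2 plus a log-degree term (to encourage connectivity) and a Frobenius penalty (to control sparsity), in the spirit of classical graph-learning formulations. It does not reproduce the paper's tensor-based hypergraph spectrum, the PDL-HGSP primal-dual solver, or t-HyperGLNN; the function names, hyperparameters, and toy data are illustrative assumptions.

```python
# Illustrative sketch only -- NOT the t-HGSP algorithm from the paper.
# Learns a pairwise graph Laplacian from signals assumed smooth on the graph.
import numpy as np

def pairwise_distances(X):
    """Z[i, j] = ||x_i - x_j||^2, where row i of X is the signal observed at node i."""
    sq = np.sum(X ** 2, axis=1)
    return sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)

def learn_laplacian(X, alpha=1.0, beta=0.5, n_iter=500, lr=1e-2):
    """Projected gradient descent on a symmetric, non-negative weight matrix W.

    Illustrative objective:
        sum_ij W_ij * Z_ij  -  alpha * sum_i log(deg_i)  +  beta * ||W||_F^2
    """
    n = X.shape[0]
    Z = pairwise_distances(X)
    W = np.ones((n, n)) - np.eye(n)               # start from a fully connected graph
    for _ in range(n_iter):
        deg = W.sum(axis=1) + 1e-10
        grad = Z - alpha * (1.0 / deg[:, None] + 1.0 / deg[None, :]) + 2.0 * beta * W
        W = W - lr * grad
        W = np.maximum(0.5 * (W + W.T), 0.0)      # project: symmetric, non-negative
        np.fill_diagonal(W, 0.0)                  # no self-loops
    L = np.diag(W.sum(axis=1)) - W                # combinatorial Laplacian L = D - W
    return W, L

# Toy usage: signals that are smooth within two clusters of nodes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (5, 20)), rng.normal(3.0, 0.1, (5, 20))])
W, L = learn_laplacian(X)
print(np.round(W, 2))
```

On this toy data, W recovers a two-block structure (strong within-cluster weights, near-zero cross-cluster weights), which is the pairwise counterpart of the smoothness assumption that the paper exploits on hypergraphs.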
Source Journal

IEEE Transactions on Signal and Information Processing over Networks (Computer Science - Computer Networks and Communications)
CiteScore: 5.80
Self-citation rate: 12.50%
Number of articles: 56
Journal Description: The IEEE Transactions on Signal and Information Processing over Networks publishes high-quality papers that extend the classical notions of processing of signals defined over vector spaces (e.g. time and space) to processing of signals and information (data) defined over networks, potentially dynamically varying. In signal processing over networks, the topology of the network may define structural relationships in the data, or may constrain processing of the data. Topics include distributed algorithms for filtering, detection, estimation, adaptation and learning, model selection, data fusion, and diffusion or evolution of information over such networks, and applications of distributed signal processing.
Latest Articles in This Journal

Reinforcement Learning-Based Event-Triggered Constrained Containment Control for Perturbed Multiagent Systems
Finite-Time Performance Mask Function-Based Distributed Privacy-Preserving Consensus: Case Study on Optimal Dispatch of Energy System
Discrete-Time Controllability of Cartesian Product Networks
Generalized Simplicial Attention Neural Networks
A Continuous-Time Algorithm for Distributed Optimization With Nonuniform Time-Delay Under Switching and Unbalanced Digraphs