Deep Hypercomplex Networks for Spatiotemporal Data Processing: Parameter efficiency and superior performance [Hypercomplex Signal and Image Processing]

IF 9.4 | CAS Tier 1 (Engineering & Technology) | JCR Q1 (Engineering, Electrical & Electronic) | IEEE Signal Processing Magazine | Pub Date: 2024-08-20 | DOI: 10.1109/MSP.2024.3381808
Alabi Bojesomo;Panos Liatsis;Hasan Al Marzouqi
{"title":"Deep Hypercomplex Networks for Spatiotemporal Data Processing: Parameter efficiency and superior performance [Hypercomplex Signal and Image Processing]","authors":"Alabi Bojesomo;Panos Liatsis;Hasan Al Marzouqi","doi":"10.1109/MSP.2024.3381808","DOIUrl":null,"url":null,"abstract":"Hypercomplex numbers, such as quaternions and octonions, have recently gained attention because of their advantageous properties over real numbers, e.g., in the development of parameter-efficient neural networks. For instance, the 16-component sedenion has the capacity to reduce the number of network parameters by a factor of 16. Moreover, hypercomplex neural networks offer advantages in the processing of spatiotemporal data as they are able to represent variable temporal data divisions through the hypercomplex components. Similarly, they support multimodal learning, with each component representing an individual modality. In this article, the key components of deep learning in the hypercomplex domain are introduced, encompassing concatenation, activation functions, convolution, and batch normalization. The use of the backpropagation algorithm for training hypercomplex networks is discussed in the context of hypercomplex algebra. These concepts are brought together in the design of a ResNet backbone using hypercomplex convolution, which is integrated within a U-Net configuration and applied in weather and traffic forecasting problems. The results demonstrate the superior performance of hypercomplex networks compared to their real-valued counterparts, given a fixed parameter budget, highlighting their potential in spatiotemporal data processing.","PeriodicalId":13246,"journal":{"name":"IEEE Signal Processing Magazine","volume":null,"pages":null},"PeriodicalIF":9.4000,"publicationDate":"2024-08-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Signal Processing Magazine","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10640320/","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Hypercomplex numbers, such as quaternions and octonions, have recently gained attention because of their advantageous properties over real numbers, e.g., in the development of parameter-efficient neural networks. For instance, the 16-component sedenion has the capacity to reduce the number of network parameters by a factor of 16. Moreover, hypercomplex neural networks offer advantages in the processing of spatiotemporal data as they are able to represent variable temporal data divisions through the hypercomplex components. Similarly, they support multimodal learning, with each component representing an individual modality. In this article, the key components of deep learning in the hypercomplex domain are introduced, encompassing concatenation, activation functions, convolution, and batch normalization. The use of the backpropagation algorithm for training hypercomplex networks is discussed in the context of hypercomplex algebra. These concepts are brought together in the design of a ResNet backbone using hypercomplex convolution, which is integrated within a U-Net configuration and applied in weather and traffic forecasting problems. The results demonstrate the superior performance of hypercomplex networks compared to their real-valued counterparts, given a fixed parameter budget, highlighting their potential in spatiotemporal data processing.
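To make the parameter savings concrete, the sketch below illustrates how a hypercomplex convolution can share weights across components: in the quaternion case, four real-valued kernels are combined through the Hamilton product, so a layer matching the channel count of a real-valued convolution uses roughly a quarter of its parameters (sedenions extend the same weight-sharing idea to 16 components). This is a minimal illustrative sketch, not the authors' implementation; the class name QuaternionConv2d, the PyTorch framing, and the channel layout (channels split into r, i, j, k blocks) are assumptions made here for clarity.

```python
# Illustrative sketch (assumed layout, not the paper's code): a quaternion
# 2-D convolution built from four shared real kernels via the Hamilton product.
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuaternionConv2d(nn.Module):
    """Quaternion convolution: four real kernels are reused across the four
    output components, giving ~4x fewer parameters than a real conv with the
    same total number of input/output channels."""
    def __init__(self, in_channels, out_channels, kernel_size, padding=0):
        super().__init__()
        assert in_channels % 4 == 0 and out_channels % 4 == 0
        in_q, out_q = in_channels // 4, out_channels // 4
        # One real-valued kernel per quaternion component (r, i, j, k).
        shape = (out_q, in_q, kernel_size, kernel_size)
        self.w_r = nn.Parameter(torch.randn(shape) * 0.02)
        self.w_i = nn.Parameter(torch.randn(shape) * 0.02)
        self.w_j = nn.Parameter(torch.randn(shape) * 0.02)
        self.w_k = nn.Parameter(torch.randn(shape) * 0.02)
        self.padding = padding

    def forward(self, x):
        # Assemble the block-structured weight implied by the Hamilton product,
        # acting on input channels ordered as [x_r, x_i, x_j, x_k] blocks.
        r, i, j, k = self.w_r, self.w_i, self.w_j, self.w_k
        weight = torch.cat([
            torch.cat([r, -i, -j, -k], dim=1),   # real part of the output
            torch.cat([i,  r, -k,  j], dim=1),   # i component
            torch.cat([j,  k,  r, -i], dim=1),   # j component
            torch.cat([k, -j,  i,  r], dim=1),   # k component
        ], dim=0)
        return F.conv2d(x, weight, padding=self.padding)

# Usage: same interface as nn.Conv2d, but with a quarter of the free parameters.
layer = QuaternionConv2d(16, 32, kernel_size=3, padding=1)
out = layer(torch.randn(1, 16, 8, 8))   # -> shape (1, 32, 8, 8)
```

The same block-matrix construction generalizes to octonions (8 components) and sedenions (16 components), which is where the factor-of-16 parameter reduction quoted in the abstract comes from.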
Source journal: IEEE Signal Processing Magazine (Engineering, Electrical & Electronic)
CiteScore: 27.20
Self-citation rate: 0.70%
Articles per year: 123
Review time: 6-12 weeks
About the journal: IEEE Signal Processing Magazine is a publication that focuses on signal processing research and applications. It publishes tutorial-style articles, columns, and forums covering a wide range of topics in signal processing. The magazine aims to provide the research, educational, and professional communities with the latest technical developments, issues, and events in the field, and serves as the society's main communication platform, addressing matters that concern all members.