Steinmetz Neural Networks for Complex-Valued Data
Shyam Venkatasubramanian, Ali Pezeshki, Vahid Tarokh
arXiv:2409.10075 (arXiv - CS - Neural and Evolutionary Computing, 2024-09-16)
Citations: 0
Abstract
In this work, we introduce a new approach to processing complex-valued data using DNNs consisting of parallel real-valued subnetworks with coupled outputs. Our proposed class of architectures, referred to as Steinmetz Neural Networks, leverages multi-view learning to construct more interpretable representations in the latent space. We then present the Analytic Neural Network, which adds a consistency penalty that encourages analytic signal representations in the Steinmetz neural network's latent space; this penalty enforces a deterministic, orthogonal relationship between the real and imaginary components. Using an information-theoretic construction, we show that the upper bound on the generalization error of the analytic neural network is lower than that of the general class of Steinmetz neural networks. Our numerical experiments on benchmark datasets and synthetic examples demonstrate the improved performance and robustness to additive noise afforded by our proposed networks.
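
The abstract does not include an implementation, but the core architectural idea can be sketched directly: two parallel real-valued subnetworks, one per view of the complex input, whose latent outputs are coupled by a shared head. The sketch below is a minimal illustration of that structure, assuming PyTorch; the class name `SteinmetzNet`, the MLP layer sizes, and the linear coupling head are illustrative assumptions, not the authors' reference code.

```python
import torch
import torch.nn as nn


class SteinmetzNet(nn.Module):
    """Two parallel real-valued MLPs (one per view of the complex input)
    whose latent outputs are coupled by a shared linear head."""

    def __init__(self, in_dim: int, hidden_dim: int, latent_dim: int, out_dim: int):
        super().__init__()

        def mlp() -> nn.Sequential:
            return nn.Sequential(
                nn.Linear(in_dim, hidden_dim), nn.ReLU(),
                nn.Linear(hidden_dim, latent_dim),
            )

        self.real_net = mlp()  # processes Re(z)
        self.imag_net = mlp()  # processes Im(z)
        # Coupling: the head sees both latent views at once.
        self.head = nn.Linear(2 * latent_dim, out_dim)

    def forward(self, z: torch.Tensor):
        # z is complex-valued; each subnetwork sees one real-valued view.
        u = self.real_net(z.real)  # latent "real" view
        v = self.imag_net(z.imag)  # latent "imaginary" view
        out = self.head(torch.cat([u, v], dim=-1))
        return out, (u, v)
```

A forward pass would then look like `pred, (u, v) = net(torch.randn(8, 64, dtype=torch.cfloat))` for `net = SteinmetzNet(64, 128, 64, 1)`; the latent pair `(u, v)` is returned so that a regularizer can act on it.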
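Likewise, the consistency penalty can be sketched with the standard FFT-based analytic-signal construction, pushing the latent imaginary view toward the Hilbert transform of the latent real view so that the pair approximates an analytic signal. Treating the latent dimension as the signal axis, the helper names, and the squared-error form of the penalty are all assumptions of this sketch; the paper's exact penalty may differ.

```python
import torch


def hilbert_imag(x: torch.Tensor) -> torch.Tensor:
    """Imaginary part of the analytic signal of a real tensor x,
    computed along the last axis via the standard FFT construction."""
    n = x.shape[-1]
    Xf = torch.fft.fft(x, dim=-1)
    # Frequency-domain mask: keep DC, double positive frequencies,
    # zero out negative frequencies (as in scipy.signal.hilbert).
    h = torch.zeros(n, dtype=Xf.dtype, device=x.device)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return torch.fft.ifft(Xf * h, dim=-1).imag


def analytic_penalty(u: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """Assumed squared-error consistency penalty: push the latent
    imaginary view v toward the Hilbert transform of the real view u."""
    return torch.mean((v - hilbert_imag(u)) ** 2)
```

In training, such a penalty would be added to the task loss, e.g. `loss = task_loss + lam * analytic_penalty(u, v)`, where the weight `lam` (hypothetical here) trades off prediction accuracy against analyticity of the latent representation.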