Coupled support tensor machine classification for multimodal neuroimaging data

Peide Li, Seyyid Emre Sofuoglu, T. Maiti, Selin Aviyente
{"title":"Coupled support tensor machine classification for multimodal neuroimaging data","authors":"L. Peide, Seyyid Emre Sofuoglu, T. Maiti, Selin Aviyente","doi":"10.1002/sam.11587","DOIUrl":null,"url":null,"abstract":"Multimodal data arise in various applications where information about the same phenomenon is acquired from multiple sensors and across different imaging modalities. Learning from multimodal data is of great interest in machine learning and statistics research as this offers the possibility of capturing complementary information among modalities. Multimodal modeling helps to explain the interdependence between heterogeneous data sources, discovers new insights that may not be available from a single modality, and improves decision‐making. Recently, coupled matrix–tensor factorization has been introduced for multimodal data fusion to jointly estimate latent factors and identify complex interdependence among the latent factors. However, most of the prior work on coupled matrix–tensor factors focuses on unsupervised learning and there is little work on supervised learning using the jointly estimated latent factors. This paper considers the multimodal tensor data classification problem. A coupled support tensor machine (C‐STM) built upon the latent factors jointly estimated from the advanced coupled matrix–tensor factorization is proposed. C‐STM combines individual and shared latent factors with multiple kernels and estimates a maximal‐margin classifier for coupled matrix–tensor data. The classification risk of C‐STM is shown to converge to the optimal Bayes risk, making it a statistically consistent rule. C‐STM is validated through simulation studies as well as a simultaneous analysis on electroencephalography with functional magnetic resonance imaging data. The empirical evidence shows that C‐STM can utilize information from multiple sources and provide a better classification performance than traditional single‐mode classifiers.","PeriodicalId":342679,"journal":{"name":"Statistical Analysis and Data Mining: The ASA Data Science Journal","volume":"6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-01-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Statistical Analysis and Data Mining: The ASA Data Science Journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1002/sam.11587","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

Multimodal data arise in various applications where information about the same phenomenon is acquired from multiple sensors and across different imaging modalities. Learning from multimodal data is of great interest in machine learning and statistics research as this offers the possibility of capturing complementary information among modalities. Multimodal modeling helps to explain the interdependence between heterogeneous data sources, discovers new insights that may not be available from a single modality, and improves decision-making. Recently, coupled matrix–tensor factorization has been introduced for multimodal data fusion to jointly estimate latent factors and identify complex interdependence among the latent factors. However, most of the prior work on coupled matrix–tensor factorization focuses on unsupervised learning, and there is little work on supervised learning using the jointly estimated latent factors. This paper considers the multimodal tensor data classification problem. A coupled support tensor machine (C-STM) built upon the latent factors jointly estimated from the advanced coupled matrix–tensor factorization (ACMTF) is proposed. C-STM combines individual and shared latent factors with multiple kernels and estimates a maximal-margin classifier for coupled matrix–tensor data. The classification risk of C-STM is shown to converge to the optimal Bayes risk, making it a statistically consistent rule. C-STM is validated through simulation studies as well as an analysis of simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) data. The empirical evidence shows that C-STM can utilize information from multiple sources and provide better classification performance than traditional single-mode classifiers.
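The abstract describes a two-stage pipeline: a coupled matrix–tensor factorization that extracts shared and modality-specific latent factors from each coupled sample, followed by a multi-kernel maximal-margin classifier built on those factors. The sketch below illustrates that idea in Python. It is not the authors' implementation: the function names, the plain alternating-least-squares factorization (the paper uses the more elaborate ACMTF), the uniform kernel weights, and the raveled-factor features are all simplifying assumptions made for illustration.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC


def kr(U, W):
    """Khatri-Rao (column-wise Kronecker) product of U (m x r) and W (n x r)."""
    return (U[:, None, :] * W[None, :, :]).reshape(-1, U.shape[1])


def cmtf_als(X, Y, rank, n_iter=50):
    """Coupled matrix-tensor factorization by alternating least squares.

    X (I x J x K) and Y (I x M) share the mode-1 factor A; B, C are
    tensor-specific and V is matrix-specific (all with `rank` columns).
    """
    I, J, K = X.shape
    rng = np.random.default_rng(0)
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    V = rng.standard_normal((Y.shape[1], rank))
    X1 = X.reshape(I, J * K)                     # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)  # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)  # mode-3 unfolding
    for _ in range(n_iter):
        P = kr(B, C)
        # The shared factor A fits the tensor and the matrix jointly.
        A = (X1 @ P + Y @ V) @ np.linalg.pinv(P.T @ P + V.T @ V)
        B = X2 @ kr(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ kr(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
        V = Y.T @ A @ np.linalg.pinv(A.T @ A)
    return A, B, C, V


def cstm_fit(pairs, labels, rank=3):
    """Factorize each (tensor, matrix) sample, build one RBF kernel per
    factor group (shared / tensor-specific / matrix-specific), sum them
    with uniform weights, and train a precomputed-kernel SVM."""
    shared, tens, mat = [], [], []
    for X, Y in pairs:
        A, B, C, V = cmtf_als(X, Y, rank)
        # Raveling factor matrices ignores the permutation and sign
        # ambiguity of the factorization -- a deliberate simplification.
        shared.append(A.ravel())
        tens.append(np.concatenate([B.ravel(), C.ravel()]))
        mat.append(V.ravel())
    Kmat = sum(rbf_kernel(np.stack(f)) for f in (shared, tens, mat))
    return SVC(kernel="precomputed").fit(Kmat, labels), Kmat


# Toy run on synthetic coupled samples (all shapes are arbitrary).
rng = np.random.default_rng(1)
pairs = [(rng.standard_normal((8, 5, 6)), rng.standard_normal((8, 4)))
         for _ in range(20)]
labels = rng.integers(0, 2, size=20)
clf, Kmat = cstm_fit(pairs, labels)
print("training accuracy:", clf.score(Kmat, labels))
```

The uniform kernel sum is the simplest multiple-kernel combination; the paper instead learns how shared and individual factors contribute, and ACMTF additionally estimates component weights to separate shared from individual components, so this sketch should be read only as a structural outline of the pipeline.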