A low-rank support tensor machine for multi-classification

Information Sciences · IF 8.1 · Zone 1, Computer Science · COMPUTER SCIENCE, INFORMATION SYSTEMS · Pub Date: 2024-08-28 · DOI: 10.1016/j.ins.2024.121398
Citations: 0

Abstract

In recent decades, demand has grown for effectively handling high-dimensional multi-channel tensor data. Because they cannot exploit internal structural information, the Support Vector Machine (SVM) and its variants struggle to classify flattened tensor data and consequently suffer from the 'curse of dimensionality'. Furthermore, most of these methods cannot be applied directly to multiclass datasets. To overcome these challenges, we have developed a novel classification method called the Multiclass Low-Rank Support Tensor Machine (MLRSTM). Our method is inspired by the well-established low-rank tensor hypothesis, which suggests a correlation between the channels of the feature tensor. Specifically, MLRSTM adopts the hinge loss function and introduces a convex approximation of tensor rank, the order-d Tensor Nuclear Norm (order-d TNN), in the regularization term. By leveraging the order-d TNN, MLRSTM effectively exploits the inherent structural information in tensor data to enhance generalization performance and avoid the curse of dimensionality. Moreover, we develop an Alternating Direction Method of Multipliers (ADMM) algorithm to optimize the convex problem inherent in training MLRSTM. Finally, comprehensive experiments validate the excellent performance of MLRSTM on tensor multi-classification tasks, showcasing its potential and efficacy in handling high-dimensional multi-channel tensor data.
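The abstract does not spell out the definition of the order-d TNN or the ADMM subproblems. As a rough sketch only: the paper's order-d TNN generalizes the widely used third-order tensor nuclear norm (based on the t-SVD), which averages the nuclear norms of the frontal slices after an FFT along the channel mode, and ADMM schemes for nuclear-norm-regularized models typically rely on singular value thresholding (SVT) as the core proximal step. The function names below (`tubal_tnn`, `svt`) are illustrative, not from the paper.

```python
import numpy as np

def tubal_tnn(X):
    """Third-order tensor nuclear norm (t-SVD based): FFT along the
    third (channel) mode, then average the nuclear norms of the
    frontal slices in the Fourier domain."""
    n3 = X.shape[2]
    Xf = np.fft.fft(X, axis=2)
    return sum(np.linalg.svd(Xf[:, :, k], compute_uv=False).sum()
               for k in range(n3)) / n3

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the matrix
    nuclear norm, the typical inner step of ADMM for low-rank models."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Sanity check: if every frontal slice equals the same matrix A, the
# FFT concentrates all energy in slice 0, so TNN(X) = ||A||_*.
A = np.eye(2)
X = np.stack([A, A, A], axis=2)
print(round(float(tubal_tnn(X)), 6))  # prints 2.0 (= ||A||_* for the 2x2 identity)
```

Thresholding shrinks the small singular values to zero, which is what drives the learned weight tensor toward low rank; for example, `svt(np.diag([3.0, 1.0]), 2.0)` keeps only the dominant direction.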

Source journal: Information Sciences (Engineering & Technology, Computer Science: Information Systems)
CiteScore: 14.00
Self-citation rate: 17.30%
Articles per year: 1322
Review time: 10.4 months
About the journal: Informatics and Computer Science Intelligent Systems Applications is an esteemed international journal that focuses on publishing original and creative research findings in the field of information sciences. We also feature a limited number of timely tutorial and surveying contributions. Our journal aims to cater to a diverse audience, including researchers, developers, managers, strategic planners, graduate students, and anyone interested in staying up-to-date with cutting-edge research in information science, knowledge engineering, and intelligent systems. While readers are expected to share a common interest in information science, they come from varying backgrounds such as engineering, mathematics, statistics, physics, computer science, cell biology, molecular biology, management science, cognitive science, neurobiology, behavioral sciences, and biochemistry.
Latest articles in this journal:
Ex-RL: Experience-based reinforcement learning
Editorial Board
Joint consensus kernel learning and adaptive hypergraph regularization for graph-based clustering
RT-DIFTWD: A novel data-driven intuitionistic fuzzy three-way decision model with regret theory
Granular correlation-based label-specific feature augmentation for multi-label classification