Alternating nonnegative least squares-incorporated regularized symmetric latent factor analysis for undirected weighted networks

Neurocomputing, Vol. 607, Article 128440 · IF 6.5 · CAS Tier 2 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE · Pub Date: 2024-11-28 (Epub: 2024-08-25) · DOI: 10.1016/j.neucom.2024.128440
Yurong Zhong, Kechen Liu, Chen Jiqiu, Xie Zhe, Weiling Li
Cited by: 0

Abstract

An Undirected Weighted Network (UWN) can be precisely quantified as an adjacency matrix whose inherent characteristics are fully exploited by a Symmetric Nonnegative Latent Factor (SNLF) model, giving such a model good representation accuracy. However, an SNLF model relies on a single latent factor matrix to describe the key topological characteristic of a UWN, i.e., its symmetry, which impairs its representation learning ability. To address this issue, this paper proposes an Alternating nonnegative least squares-incorporated Regularized Symmetric Latent factor analysis (ARSL) model. First, equality constraints composed of multiple matrices are built into its learning objective to describe the symmetry of a UWN accurately. An L2-norm-based regularization scheme then relaxes these constraints, making the symmetry-aware learning objective solvable. Finally, an alternating nonnegative least squares-incorporated algorithm is designed to optimize its parameters efficiently. Empirical studies on four UWNs demonstrate that an ARSL model outperforms state-of-the-art models in representation accuracy while achieving promising computational efficiency.
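The abstract does not give the model's exact objective or update rules, so the following is only an illustrative sketch of the general idea it describes: factor a symmetric nonnegative adjacency matrix A as X Yᵀ, relax the symmetry-enforcing equality constraint X = Y into an L2 penalty λ‖X − Y‖²_F, and optimize by alternating regularized least squares with projection onto the nonnegative orthant. All parameter names and values here are hypothetical choices for the sketch, not the paper's.

```python
import numpy as np

def arsl_sketch(A, rank=4, lam=1.0, reg=0.1, iters=200, seed=0):
    """Illustrative symmetric nonnegative factorization: A ~= X @ Y.T, X ~= Y >= 0.

    The equality constraint X = Y is relaxed into the penalty lam * ||X - Y||_F^2,
    and each subproblem is solved in closed form (a ridge least-squares step)
    followed by clipping to keep the factors nonnegative.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X = rng.random((n, rank))
    Y = rng.random((n, rank))
    I = np.eye(rank)
    for _ in range(iters):
        # min_X ||A - X Y^T||_F^2 + lam ||X - Y||_F^2 + reg ||X||_F^2, then project.
        X = (A @ Y + lam * Y) @ np.linalg.inv(Y.T @ Y + (lam + reg) * I)
        X = np.maximum(X, 0.0)
        # Symmetric update for Y with X held fixed.
        Y = (A.T @ X + lam * X) @ np.linalg.inv(X.T @ X + (lam + reg) * I)
        Y = np.maximum(Y, 0.0)
    return X, Y

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    B = rng.random((6, 3))
    A = B @ B.T  # symmetric nonnegative test matrix
    X, Y = arsl_sketch(A, rank=3)
    rel_err = np.linalg.norm(A - X @ Y.T) / np.linalg.norm(A)
    print(f"relative error: {rel_err:.3f}")
    print(f"symmetry gap ||X - Y||_F: {np.linalg.norm(X - Y):.3f}")
```

Larger λ pushes X and Y closer together (tighter symmetry), at the cost of a harder fitting problem; the paper's actual algorithm and regularization constants should be taken from the full text.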

Source Journal

Neurocomputing (Engineering & Technology · Computer Science: Artificial Intelligence)
CiteScore: 13.10
Self-citation rate: 10.00%
Annual article count: 1382
Review time: 70 days
Journal description: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.