Hybrid multimodal fusion for graph learning in disease prediction

IF 4.2 · CAS Zone 3 (Biology) · Q1 Biochemical Research Methods · Methods · Pub Date: 2024-06-14 · DOI: 10.1016/j.ymeth.2024.06.003
Ruomei Wang, Wei Guo, Yongjie Wang, Xin Zhou, Jonathan Cyril Leung, Shuo Yan, Lizhen Cui
{"title":"Hybrid multimodal fusion for graph learning in disease prediction","authors":"Ruomei Wang ,&nbsp;Wei Guo ,&nbsp;Yongjie Wang ,&nbsp;Xin Zhou ,&nbsp;Jonathan Cyril Leung ,&nbsp;Shuo Yan ,&nbsp;Lizhen Cui","doi":"10.1016/j.ymeth.2024.06.003","DOIUrl":null,"url":null,"abstract":"<div><p>Graph neural networks (GNNs) have gained significant attention in disease prediction where the latent embeddings of patients are modeled as nodes and the similarities among patients are represented through edges. The graph structure, which determines how information is aggregated and propagated, plays a crucial role in graph learning. Recent approaches typically create graphs based on patients' latent embeddings, which may not accurately reflect their real-world closeness. Our analysis reveals that raw data, such as demographic attributes and laboratory results, offers a wealth of information for assessing patient similarities and can serve as a compensatory measure for graphs constructed exclusively from latent embeddings. In this study, we first construct adaptive graphs from both latent representations and raw data respectively, and then merge these graphs via weighted summation. Given that the graphs may contain extraneous and noisy connections, we apply degree-sensitive edge pruning and kNN sparsification techniques to selectively sparsify and prune these edges. We conducted intensive experiments on two diagnostic prediction datasets, and the results demonstrate that our proposed method surpasses current state-of-the-art techniques.</p></div>","PeriodicalId":390,"journal":{"name":"Methods","volume":"229 ","pages":"Pages 41-48"},"PeriodicalIF":4.2000,"publicationDate":"2024-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Methods","FirstCategoryId":"99","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1046202324001531","RegionNum":3,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"BIOCHEMICAL RESEARCH METHODS","Score":null,"Total":0}
Citations: 0

Abstract

Graph neural networks (GNNs) have gained significant attention in disease prediction where the latent embeddings of patients are modeled as nodes and the similarities among patients are represented through edges. The graph structure, which determines how information is aggregated and propagated, plays a crucial role in graph learning. Recent approaches typically create graphs based on patients' latent embeddings, which may not accurately reflect their real-world closeness. Our analysis reveals that raw data, such as demographic attributes and laboratory results, offers a wealth of information for assessing patient similarities and can serve as a compensatory measure for graphs constructed exclusively from latent embeddings. In this study, we first construct adaptive graphs from both latent representations and raw data respectively, and then merge these graphs via weighted summation. Given that the graphs may contain extraneous and noisy connections, we apply degree-sensitive edge pruning and kNN sparsification techniques to selectively sparsify and prune these edges. We conducted intensive experiments on two diagnostic prediction datasets, and the results demonstrate that our proposed method surpasses current state-of-the-art techniques.
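The abstract describes the pipeline only at a high level. As a rough illustration of the graph-fusion step, the sketch below (not the authors' implementation; names such as `fuse_graphs`, `alpha`, and `k` are assumptions) builds one cosine-similarity graph from latent embeddings and another from raw features, merges them by weighted summation, and applies kNN sparsification. The degree-sensitive edge pruning mentioned in the abstract is omitted for brevity.

```python
# Minimal sketch of graph fusion for patient-similarity graphs.
# Assumed, illustrative code; not the paper's released implementation.
import numpy as np


def cosine_similarity_graph(x: np.ndarray) -> np.ndarray:
    """Dense adjacency from pairwise cosine similarity of row vectors."""
    norm = np.linalg.norm(x, axis=1, keepdims=True) + 1e-12
    x_hat = x / norm
    return x_hat @ x_hat.T


def knn_sparsify(adj: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k strongest neighbors per node, then symmetrize."""
    n = adj.shape[0]
    sparse = np.zeros_like(adj)
    for i in range(n):
        scores = adj[i].copy()
        scores[i] = -np.inf  # exclude self-loops when picking neighbors
        top = np.argsort(scores)[-k:]
        sparse[i, top] = adj[i, top]
    return np.maximum(sparse, sparse.T)


def fuse_graphs(latent: np.ndarray, raw: np.ndarray,
                alpha: float = 0.5, k: int = 10) -> np.ndarray:
    """Weighted sum of a latent-embedding graph and a raw-feature graph,
    followed by kNN sparsification. `alpha` balances the two sources."""
    a_latent = cosine_similarity_graph(latent)
    a_raw = cosine_similarity_graph(raw)
    fused = alpha * a_latent + (1.0 - alpha) * a_raw
    return knn_sparsify(fused, k)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(100, 32))   # e.g. patient latent embeddings
    raw = rng.normal(size=(100, 16))      # e.g. demographics + lab results
    adj = fuse_graphs(latent, raw, alpha=0.6, k=10)
    print(adj.shape, (adj > 0).sum())
```

In this reading, `alpha` plays the role of the weighted-summation coefficient between the latent-embedding graph and the raw-data graph, and the resulting sparse adjacency would then feed a GNN for diagnostic prediction.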

Source journal: Methods (Biology, Biochemical Research Methods)
CiteScore: 9.80
Self-citation rate: 2.10%
Annual articles: 222
Review time: 11.3 weeks
Journal description: Methods focuses on rapidly developing techniques in the experimental biological and medical sciences. Each topical issue, organized by a guest editor who is an expert in the area covered, consists solely of invited quality articles by specialist authors, many of them reviews. Issues are devoted to specific technical approaches with emphasis on clear detailed descriptions of protocols that allow them to be reproduced easily. The background information provided enables researchers to understand the principles underlying the methods; other helpful sections include comparisons of alternative methods giving the advantages and disadvantages of particular methods, guidance on avoiding potential pitfalls, and suggestions for troubleshooting.