Asim Waqas, Aakash Tripathi, Paul Stewart, Mia Naeini, Ghulam Rasool
arXiv:2406.08521 · arXiv - QuanBio - Cell Behavior · Published 2024-06-11
Embedding-based Multimodal Learning on Pan-Squamous Cell Carcinomas for Improved Survival Outcomes
Cancer clinics capture disease data at various scales, from genetic to organ
level. Current bioinformatic methods struggle to handle the heterogeneous
nature of this data, especially with missing modalities. We propose PARADIGM, a
Graph Neural Network (GNN) framework that learns from multimodal, heterogeneous
datasets to improve clinical outcome prediction. PARADIGM generates embeddings
from multi-resolution data using foundation models, aggregates them into
patient-level representations, fuses them into a unified graph, and enhances
performance for tasks like survival analysis. We train GNNs on pan-Squamous
Cell Carcinomas and validate our approach on Moffitt Cancer Center lung SCC
data. The multimodal GNN outperforms other models in patient survival prediction.
Converging individual data modalities across varying scales yields a more
insightful view of the disease. Our solution aims to understand each patient's
circumstances comprehensively, offering insights into heterogeneous data
integration and the benefits of converging the maximum number of data views.
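The pipeline described above (per-modality embeddings, patient-level aggregation, fusion into a unified patient graph, and GNN-based survival scoring) can be sketched in minimal form. This is an illustrative NumPy sketch only, not the PARADIGM implementation: the modality names, embedding sizes, mean fusion, k-NN graph construction, and random weights are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical foundation-model embeddings for 6 patients across two
# modalities (names illustrative, not from the paper).
emb_imaging = rng.normal(size=(6, 16))
emb_genomic = rng.normal(size=(6, 16))

# Aggregate modalities into one patient-level representation (mean fusion
# is an assumption; the paper does not specify this operator here).
x = (emb_imaging + emb_genomic) / 2.0

def knn_adjacency(x, k=2):
    """Fuse patients into a unified graph: connect each patient to its
    k nearest neighbors by cosine similarity of fused embeddings."""
    xn = x / np.linalg.norm(x, axis=1, keepdims=True)
    sim = xn @ xn.T
    np.fill_diagonal(sim, -np.inf)  # no self-loops at this stage
    adj = np.zeros_like(sim)
    for i in range(len(x)):
        for j in np.argsort(sim[i])[-k:]:
            adj[i, j] = adj[j, i] = 1.0  # keep the graph undirected
    return adj

adj = knn_adjacency(x)

# One GCN-style propagation step: A_hat = D^{-1/2} (A + I) D^{-1/2},
# then a (randomly initialized) linear layer and ReLU.
a_hat = adj + np.eye(len(x))
d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
w = rng.normal(size=(16, 8))
h = np.maximum(d_inv_sqrt @ a_hat @ d_inv_sqrt @ x @ w, 0)

# Linear risk head: one scalar score per patient, the kind of output a
# Cox-style survival objective would rank patients by.
risk = h @ rng.normal(size=(8,))
print(risk.shape)
```

In a real system the random weights would be trained end-to-end against a survival loss (e.g. a Cox partial likelihood), and the graph would typically be handled by a GNN library rather than dense matrix products.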