Disease comorbidity—the co-occurrence of multiple diseases in the same individual—is increasingly prevalent and poses major clinical and biological challenges. Computational approaches for studying disease relationships and predicting comorbidity have evolved from overlap-based similarity measures to molecular network modeling and graph deep learning. However, existing methods often (i) learn global or subgraph-based disease embeddings without modeling the topology of fragmented disease subgraphs in a comorbidity-adaptive manner, or (ii) incorporate Gene Ontology (GO) information in ways that underutilize GO’s hierarchical ancestry and deeper functional abstractions. In this work, we propose DisSubFormer, a subgraph Transformer model for disease subgraph representation learning and comorbidity prediction. We first learn unified protein representations by integrating structural patterns from a PPI network with GO-aware functional information, explicitly incorporating GO’s hierarchical ancestry. We next sample biologically informed anchor patches in a property-aware manner to prioritize disease-relevant regions of the PPI network, replacing full-graph attention with subgraph-to-subgraph attention between disease subgraphs and these anchor patches to improve scalability and relevance. Specifically, DisSubFormer introduces a learnable multi-head attention mechanism where each head attends over a distinct anchor-patch type, with head-specific relational terms to capture complementary positional, neighborhood, and structural properties within fragmented disease subgraphs for comorbidity prediction. Experiments on a benchmark comorbidity dataset demonstrate that DisSubFormer consistently outperforms state-of-the-art methods, achieving an AUROC of 0.97.
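The core architectural idea — multi-head attention in which each head attends over a distinct anchor-patch type, with a head-specific relational bias — can be illustrated with a minimal sketch. All names, shapes, and the random initialisation below are illustrative assumptions, not the paper's actual implementation; in the real model the projection matrices and relational terms are learned.

```python
import numpy as np

def softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def subgraph_anchor_attention(sub_emb, anchor_patches, rel_bias, d_head=16, seed=0):
    """Attend one disease subgraph over typed anchor patches, one head per type.

    sub_emb:        (n, d) node embeddings of a fragmented disease subgraph
    anchor_patches: dict mapping patch type -> (m_t, d) anchor-patch embeddings
    rel_bias:       dict mapping patch type -> (n, m_t) head-specific relational term
    """
    rng = np.random.default_rng(seed)
    d = sub_emb.shape[1]
    heads = []
    for ptype, patches in anchor_patches.items():
        # Head-specific projections (randomly initialised here; learned in practice).
        Wq, Wk, Wv = (rng.standard_normal((d, d_head)) / np.sqrt(d) for _ in range(3))
        Q, K, V = sub_emb @ Wq, patches @ Wk, patches @ Wv
        # Relational bias carries the positional / neighborhood / structural cues
        # associated with this anchor-patch type.
        scores = Q @ K.T / np.sqrt(d_head) + rel_bias[ptype]
        heads.append(softmax(scores) @ V)
    # Concatenate heads: (n, n_patch_types * d_head)
    return np.concatenate(heads, axis=-1)

# Toy usage: 5 subgraph nodes, 3 anchor-patch types of differing sizes.
sub = np.random.default_rng(1).standard_normal((5, 32))
patches = {
    "position": np.random.default_rng(2).standard_normal((7, 32)),
    "neighborhood": np.random.default_rng(3).standard_normal((4, 32)),
    "structure": np.random.default_rng(4).standard_normal((6, 32)),
}
bias = {k: np.zeros((5, v.shape[0])) for k, v in patches.items()}
out = subgraph_anchor_attention(sub, patches, bias)
# out.shape == (5, 48)
```

Because attention is computed against fixed-size anchor patches rather than the full PPI network, the per-subgraph cost scales with the number and size of the patches, which is the scalability gain the abstract describes.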