Peng Zhang, Chaofei Gao, Zhuoyu Zhang, Zhiyuan Yuan, Qian Zhang, Ping Zhang, Shiyu Du, Weixun Zhou, Yan Li, Shao Li
Systematic inference of super-resolution cell spatial profiles from histology images
Nature Communications (published 2025-02-21). DOI: 10.1038/s41467-025-57072-6
Citations: 0
Abstract
Inferring cell spatial profiles from histology images is critical for cancer diagnosis and treatment in clinical settings. In this study, we report a weakly supervised deep-learning method, HistoCell, to directly infer super-resolution cell spatial profiles, consisting of cell types, cell states and their spatial network, from histology images at the single-nucleus level. Benchmark analysis demonstrates that HistoCell robustly achieves state-of-the-art performance in cell type/state prediction solely from histology images across multiple cancer tissues. HistoCell significantly enhances deconvolution accuracy for spatial transcriptomics data and enables accurate annotation of subtle cancer tissue architectures. Moreover, HistoCell is applied to the de novo discovery of clinically relevant spatial organization indicators, including prognosis and drug response biomarkers, across diverse cancer types. HistoCell also enables image-based screening of cell populations that drive a phenotype of interest, and is applied to discover the cell population and corresponding spatial organization indicators associated with gastric malignant transformation risk. Overall, HistoCell emerges as a powerful and versatile tool for cancer studies in histology-image-only cohorts.
About the journal:
Nature Communications, an open-access journal, publishes high-quality research spanning all areas of the natural sciences. Papers featured in the journal showcase significant advances relevant to specialists in each respective field. With a 2-year impact factor of 16.6 (2022) and a median time of 8 days from submission to the first editorial decision, Nature Communications is committed to rapid dissemination of research findings. As a multidisciplinary journal, it welcomes contributions from biological, health, physical, chemical, Earth, social, mathematical, applied, and engineering sciences, aiming to highlight important breakthroughs within each domain.