Evaluation of tumor budding with virtual panCK stains generated by novel multi-model CNN framework

Computer Methods and Programs in Biomedicine, Volume 257, Article 108352 | IF 4.9 | CAS Tier 2 (Medicine) | Q1 (Computer Science, Interdisciplinary Applications) | Pub Date: 2024-08-22 | DOI: 10.1016/j.cmpb.2024.108352
Xingzhong Hou , Zhen Guan , Xianwei Zhang , Xiao Hu , Shuangmei Zou , Chunzi Liang , Lulin Shi , Kaitai Zhang , Haihang You
Citations: 0

Abstract

As the global incidence of cancer continues to rise, the need for swift and precise diagnosis has become increasingly pressing. Pathologists commonly rely on H&E–panCK stain pairs for several aspects of cancer diagnosis, including the detection of occult tumor cells and the evaluation of tumor budding. However, conventional chemical staining suffers from notable drawbacks, such as time-intensive processing and irreversible staining outcomes. Virtual staining, which leverages generative adversarial networks (GANs), has emerged as a promising alternative: it aims to transform biopsy scans (typically H&E) into other stain types. Despite notable progress in recent years, current state-of-the-art virtual staining models still struggle to produce accurate staining outcomes under specific conditions, and these limitations have impeded the integration of virtual staining into diagnostic practice. To produce virtual panCK stains capable of replacing chemical panCK, we propose a multi-model framework. Our approach combines Mask-RCNN (for cell segmentation) with GAN models to extract the cytokeratin distribution from chemical H&E images; we then introduce a tailored dynamic GAN model that converts H&E images into virtual panCK stains by integrating the derived cytokeratin distribution. The framework is motivated by the fact that the distinctive pattern of panCK staining derives from the cytokeratin distribution.
As a proof of concept, we use our virtual panCK stains to evaluate tumor budding in 45 H&E whole-slide images taken from lymph nodes invaded by breast cancer. Validated thoroughly by both pathologists and the QuPath software, our virtual panCK stains demonstrate a high level of accuracy; in stark contrast, the accuracy of state-of-the-art single-cycleGAN virtual panCK stains is negligible. To the best of our knowledge, this is the first multi-model virtual panCK framework and the first use of virtual panCK for tumor budding assessment. Our framework generates dependable virtual panCK stains with substantially improved efficiency, considerably reducing diagnostic turnaround times, and its outputs are readily interpretable even by pathologists who are not well-versed in computer technology. We believe this framework can advance the field of virtual staining and thereby make significant strides towards improved cancer diagnosis.
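The abstract describes a three-stage pipeline: Mask-RCNN segments cells, a GAN extracts the cytokeratin distribution from the H&E image, and a dynamic GAN generates the virtual panCK stain conditioned on that distribution. The sketch below illustrates only the data flow, not the authors' implementation: each learned model is replaced by a hypothetical stand-in function, and the thresholds and channel choices are assumptions for illustration.

```python
import numpy as np

def segment_cells(h_and_e: np.ndarray) -> np.ndarray:
    """Stand-in for the Mask-RCNN stage: returns a binary cell mask.
    A simple intensity threshold substitutes for the learned detector."""
    gray = h_and_e.mean(axis=-1)
    return (gray < 0.5).astype(np.float32)

def estimate_cytokeratin(h_and_e: np.ndarray, cell_mask: np.ndarray) -> np.ndarray:
    """Stand-in for the GAN that extracts the cytokeratin distribution:
    restricts a crude per-pixel score to the segmented cells."""
    proxy_channel = h_and_e[..., 0]  # assumed proxy for CK-bearing signal
    return proxy_channel * cell_mask

def generate_panck(h_and_e: np.ndarray, ck_map: np.ndarray) -> np.ndarray:
    """Stand-in for the dynamic GAN generator, conditioned on the
    derived cytokeratin distribution."""
    stain = np.zeros_like(h_and_e)
    stain[..., 0] = ck_map        # DAB-like signal where CK is predicted
    stain[..., 2] = 1.0 - ck_map  # hematoxylin-like counterstain elsewhere
    return stain

def virtual_panck_pipeline(h_and_e: np.ndarray) -> np.ndarray:
    """Compose the three stages: segment -> extract CK -> generate panCK."""
    mask = segment_cells(h_and_e)
    ck = estimate_cytokeratin(h_and_e, mask)
    return generate_panck(h_and_e, ck)

tile = np.random.rand(64, 64, 3).astype(np.float32)  # toy H&E tile in [0, 1]
out = virtual_panck_pipeline(tile)
print(out.shape)  # (64, 64, 3)
```

The key design point the sketch preserves is that the generator does not see the H&E image alone: it is conditioned on an explicitly derived cytokeratin map, which is what the paper argues distinguishes the multi-model framework from a single cycleGAN.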

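The proof-of-concept evaluation counts tumor buds on the virtual panCK stain. Under the usual definition (a bud is an isolated tumor cell or a cluster of up to four tumor cells), bud counting reduces to finding small connected panCK-positive components. The sketch below is a minimal, assumed implementation of that counting step, not the pathologists' or QuPath's procedure; the per-cell pixel area is a made-up parameter.

```python
import numpy as np

def count_tumor_buds(panck_mask: np.ndarray, max_cells: int = 4,
                     cell_area: int = 50) -> int:
    """Count connected panCK-positive regions small enough to qualify as
    tumor buds (clusters of at most `max_cells` tumor cells). `cell_area`
    is an assumed mean cell footprint in pixels. Components are found by
    iterative flood fill with 4-connectivity."""
    visited = np.zeros_like(panck_mask, dtype=bool)
    h, w = panck_mask.shape
    buds = 0
    for i in range(h):
        for j in range(w):
            if panck_mask[i, j] and not visited[i, j]:
                # flood-fill this component while measuring its area
                stack, area = [(i, j)], 0
                visited[i, j] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and panck_mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if area <= max_cells * cell_area:
                    buds += 1
    return buds

mask = np.zeros((32, 32), dtype=bool)
mask[2:5, 2:5] = True       # small cluster -> counted as a bud
mask[10:30, 10:30] = True   # large tumor mass -> not a bud
print(count_tumor_buds(mask))  # 1
```

In practice a tool such as QuPath performs cell-level detection rather than pixel-area heuristics, which is why the paper validates the virtual stains against both QuPath and pathologists.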
Source journal
Computer Methods and Programs in Biomedicine (Engineering: Biomedical)
CiteScore: 12.30
Self-citation rate: 6.60%
Articles per year: 601
Review time: 135 days
Journal introduction: To encourage the development of formal computing methods, and their application in biomedical research and medical practice, by illustration of fundamental principles in biomedical informatics research; to stimulate basic research into application software design; to report the state of research of biomedical information processing projects; to report new computer methodologies applied in biomedical areas; the eventual distribution of demonstrable software to avoid duplication of effort; to provide a forum for discussion and improvement of existing software; to optimize contact between national organizations and regional user groups by promoting an international exchange of information on formal methods, standards and software in biomedicine. Computer Methods and Programs in Biomedicine covers computing methodology and software systems derived from computing science for implementation in all aspects of biomedical research and medical practice. It is designed to serve: biochemists; biologists; geneticists; immunologists; neuroscientists; pharmacologists; toxicologists; clinicians; epidemiologists; psychiatrists; psychologists; cardiologists; chemists; (radio)physicists; computer scientists; programmers and systems analysts; biomedical, clinical, electrical and other engineers; teachers of medical informatics and users of educational software.