A2HTL: An Automated Hybrid Transformer-Based Learning for Predicting Survival of Esophageal Cancer Using CT Images
Hailin Yue; Jin Liu; Lina Zhao; Hulin Kuang; Jianhong Cheng; Junjian Li; Mengshen He; Jie Gong; Jianxin Wang
{"title":"A2HTL:利用CT图像预测食管癌存活率的基于混合变压器的自动学习方法","authors":"Hailin Yue;Jin Liu;Lina Zhao;Hulin Kuang;Jianhong Cheng;Junjian Li;Mengshen He;Jie Gong;Jianxin Wang","doi":"10.1109/TNB.2024.3441533","DOIUrl":null,"url":null,"abstract":"Esophageal cancer is a common malignant tumor, precisely predicting survival of esophageal cancer is crucial for personalized treatment. However, current region of interest (ROI) based methodologies not only necessitate prior medical knowledge for tumor delineation, but may also cause the model to be overly sensitive to ROI. To address these challenges, we develop an automated Hybrid Transformer based learning that integrates a Hybrid Transformer size-aware U-Net with a ranked survival prediction network to enable automatic survival prediction for esophageal cancer. Specifically, we first incorporate the Transformer with shifted windowing multi-head self-attention mechanism (SW-MSA) into the base of the U-Net encoder to capture the long-range dependency in CT images. Furthermore, to alleviate the imbalance between the ROI and the background in CT images, we devise a size-aware coefficient for the segmentation loss. Finally, we also design a ranked pair sorting loss to more comprehensively capture the ranked information inherent in CT images. We evaluate our proposed method on a dataset comprising 759 samples with esophageal cancer. Experimental results demonstrate the superior performance of our proposed method in survival prediction, even without ROI ground truth.","PeriodicalId":13264,"journal":{"name":"IEEE Transactions on NanoBioscience","volume":"23 4","pages":"548-555"},"PeriodicalIF":3.7000,"publicationDate":"2024-08-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A2HTL: An Automated Hybrid Transformer-Based Learning for Predicting Survival of Esophageal Cancer Using CT Images\",\"authors\":\"Hailin Yue;Jin Liu;Lina Zhao;Hulin Kuang;Jianhong Cheng;Junjian Li;Mengshen He;Jie Gong;Jianxin Wang\",\"doi\":\"10.1109/TNB.2024.3441533\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Esophageal cancer is a common malignant tumor, precisely predicting survival of esophageal cancer is crucial for personalized treatment. However, current region of interest (ROI) based methodologies not only necessitate prior medical knowledge for tumor delineation, but may also cause the model to be overly sensitive to ROI. To address these challenges, we develop an automated Hybrid Transformer based learning that integrates a Hybrid Transformer size-aware U-Net with a ranked survival prediction network to enable automatic survival prediction for esophageal cancer. Specifically, we first incorporate the Transformer with shifted windowing multi-head self-attention mechanism (SW-MSA) into the base of the U-Net encoder to capture the long-range dependency in CT images. Furthermore, to alleviate the imbalance between the ROI and the background in CT images, we devise a size-aware coefficient for the segmentation loss. Finally, we also design a ranked pair sorting loss to more comprehensively capture the ranked information inherent in CT images. We evaluate our proposed method on a dataset comprising 759 samples with esophageal cancer. 
Experimental results demonstrate the superior performance of our proposed method in survival prediction, even without ROI ground truth.\",\"PeriodicalId\":13264,\"journal\":{\"name\":\"IEEE Transactions on NanoBioscience\",\"volume\":\"23 4\",\"pages\":\"548-555\"},\"PeriodicalIF\":3.7000,\"publicationDate\":\"2024-08-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on NanoBioscience\",\"FirstCategoryId\":\"99\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10633746/\",\"RegionNum\":4,\"RegionCategory\":\"生物学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"BIOCHEMICAL RESEARCH METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on NanoBioscience","FirstCategoryId":"99","ListUrlMain":"https://ieeexplore.ieee.org/document/10633746/","RegionNum":4,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"BIOCHEMICAL RESEARCH METHODS","Score":null,"Total":0}
Abstract
Esophageal cancer is a common malignant tumor, and precisely predicting the survival of esophageal cancer is crucial for personalized treatment. However, current region of interest (ROI) based methodologies not only require prior medical knowledge for tumor delineation, but may also cause the model to be overly sensitive to the ROI. To address these challenges, we develop an automated Hybrid Transformer-based learning method that integrates a Hybrid Transformer size-aware U-Net with a ranked survival prediction network to enable automatic survival prediction for esophageal cancer. Specifically, we first incorporate a Transformer with a shifted-window multi-head self-attention mechanism (SW-MSA) into the base of the U-Net encoder to capture long-range dependencies in CT images. Furthermore, to alleviate the imbalance between the ROI and the background in CT images, we devise a size-aware coefficient for the segmentation loss. Finally, we also design a ranked pair sorting loss to more comprehensively capture the ranking information inherent in CT images. We evaluate our proposed method on a dataset comprising 759 esophageal cancer samples. Experimental results demonstrate the superior performance of our proposed method in survival prediction, even without ROI ground truth.
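The abstract specifies neither the exact form of the size-aware segmentation coefficient nor of the ranked pair sorting loss. As a rough, hypothetical sketch only (not the authors' implementation), the PyTorch snippet below shows one plausible way such components could look: the weighting scheme, the hinge margin, and the names size_aware_dice_loss and ranked_pair_loss are assumptions introduced here purely for illustration.

```python
# Hypothetical sketch (not the authors' code): a size-aware segmentation loss and a
# pairwise ranking loss for survival prediction, written in PyTorch for illustration.
import torch


def size_aware_dice_loss(pred, target, eps=1e-6):
    # pred:   (B, H, W) predicted foreground probabilities in [0, 1]
    # target: (B, H, W) binary ROI masks
    # Dice loss weighted per sample by ROI size: smaller ROIs receive larger weights,
    # one plausible way to counter the ROI/background imbalance the abstract mentions.
    pred = pred.flatten(1)
    target = target.flatten(1)
    roi_fraction = target.mean(dim=1)              # fraction of ROI pixels per sample
    weight = 1.0 / (roi_fraction + eps)            # smaller ROI -> larger weight
    weight = weight / weight.sum()                 # normalize weights over the batch
    intersection = (pred * target).sum(dim=1)
    dice = (2 * intersection + eps) / (pred.sum(dim=1) + target.sum(dim=1) + eps)
    return (weight * (1.0 - dice)).sum()


def ranked_pair_loss(risk, time, event, margin=0.1):
    # risk:  (B,) predicted risk scores
    # time:  (B,) observed survival or censoring times
    # event: (B,) 1 if the event (death) was observed, 0 if censored
    # For every comparable pair (patient i has an observed event earlier than j's time),
    # the predicted risk of i should exceed that of j by at least `margin`.
    # The hinge form and the margin value are assumptions, not the paper's formula.
    losses = []
    for i in range(risk.shape[0]):
        if event[i] != 1:
            continue                               # pairs must be anchored on an observed event
        for j in range(risk.shape[0]):
            if time[j] > time[i]:                  # j is known to have survived longer than i
                losses.append(torch.relu(margin - (risk[i] - risk[j])))
    if not losses:
        return risk.sum() * 0.0                    # no comparable pairs in this batch
    return torch.stack(losses).mean()


# Toy usage on random tensors:
pred = torch.sigmoid(torch.randn(4, 64, 64))
mask = (torch.rand(4, 64, 64) > 0.95).float()
risk = torch.randn(4)
time = torch.tensor([12.0, 30.0, 7.0, 24.0])
event = torch.tensor([1, 0, 1, 1])
print(size_aware_dice_loss(pred, mask).item())
print(ranked_pair_loss(risk, time, event).item())
```

In this sketch, samples with smaller ROIs contribute more to the segmentation loss, and the ranking term rewards assigning higher risk scores to patients with earlier observed events, which mirrors the motivations stated in the abstract.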
Journal Introduction:
The IEEE Transactions on NanoBioscience reports on original, innovative, and interdisciplinary work on all aspects of molecular systems, cellular systems, and tissues (including molecular electronics). Topics covered in the journal span a broad spectrum, both on foundations and on applications. Specifically, the journal covers methods and techniques, experimental aspects, design and implementation, instrumentation and laboratory equipment, clinical aspects, hardware and software for data acquisition and analysis, and computer-based modelling (based on traditional or high-performance computing, including parallel computers and computer networks).