Spatial and channel attention-based conditional Wasserstein GAN for direct and rapid image reconstruction in ultrasound computed tomography.

IF 3.2 · CAS Region 4 (Medicine) · JCR Q2 (Engineering, Biomedical) · Biomedical Engineering Letters · Pub Date: 2023-08-17 · eCollection Date: 2024-01-01 · DOI: 10.1007/s13534-023-00310-x
Xiaoyun Long, Chao Tian
{"title":"Spatial and channel attention-based conditional Wasserstein GAN for direct and rapid image reconstruction in ultrasound computed tomography.","authors":"Xiaoyun Long, Chao Tian","doi":"10.1007/s13534-023-00310-x","DOIUrl":null,"url":null,"abstract":"<p><p>Ultrasound computed tomography (USCT) is an emerging technology that offers a noninvasive and radiation-free imaging approach with high sensitivity, making it promising for the early detection and diagnosis of breast cancer. The speed-of-sound (SOS) parameter plays a crucial role in distinguishing between benign masses and breast cancer. However, traditional SOS reconstruction methods face challenges in achieving a balance between resolution and computational efficiency, which hinders their clinical applications due to high computational complexity and long reconstruction times. In this paper, we propose a novel and efficient approach for direct SOS image reconstruction based on an improved conditional generative adversarial network. The generator directly reconstructs SOS images from time-of-flight information, eliminating the need for intermediate steps. Residual spatial-channel attention blocks are integrated into the generator to adaptively determine the relevance of arrival time from the transducer pair corresponding to each pixel in the SOS image. An ablation study verified the effectiveness of this module. Qualitative and quantitative evaluation results on breast phantom datasets demonstrate that this method is capable of rapidly reconstructing high-quality SOS images, achieving better generation results and image quality. Therefore, we believe that the proposed algorithm represents a new direction in the research area of USCT SOS reconstruction.</p>","PeriodicalId":46898,"journal":{"name":"Biomedical Engineering Letters","volume":null,"pages":null},"PeriodicalIF":3.2000,"publicationDate":"2023-08-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10770017/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biomedical Engineering Letters","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1007/s13534-023-00310-x","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/1/1 0:00:00","PubModel":"eCollection","JCR":"Q2","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 0

Abstract

Ultrasound computed tomography (USCT) is an emerging technology that offers a noninvasive, radiation-free imaging approach with high sensitivity, making it promising for the early detection and diagnosis of breast cancer. The speed-of-sound (SOS) parameter plays a crucial role in distinguishing benign masses from breast cancer. However, traditional SOS reconstruction methods struggle to balance resolution and computational efficiency; their high computational complexity and long reconstruction times hinder clinical application. In this paper, we propose a novel and efficient approach for direct SOS image reconstruction based on an improved conditional generative adversarial network. The generator reconstructs SOS images directly from time-of-flight information, eliminating the need for intermediate steps. Residual spatial-channel attention blocks are integrated into the generator to adaptively weight the relevance of each transducer pair's arrival time to each pixel in the SOS image. An ablation study verified the effectiveness of this module. Qualitative and quantitative evaluations on breast phantom datasets demonstrate that the method rapidly reconstructs high-quality SOS images, achieving better generation results and image quality. We therefore believe the proposed algorithm represents a new direction in USCT SOS reconstruction research.
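The abstract does not detail the internal design of the residual spatial-channel attention blocks, but a CBAM-style sketch illustrates the general idea: channel attention reweights feature maps using globally pooled statistics, spatial attention reweights pixel locations, and a skip connection keeps the block residual. The PyTorch-style code below is a minimal illustrative sketch under those assumptions; the class names (ChannelAttention, SpatialAttention, ResidualSCABlock), reduction ratio, and kernel sizes are hypothetical and not taken from the paper.

import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze global context per channel and rescale the feature map."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # (B, C, 1, 1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(self.pool(x))


class SpatialAttention(nn.Module):
    """Weight each pixel location using channel-pooled statistics."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)   # (B, 1, H, W)
        mx, _ = x.max(dim=1, keepdim=True)  # (B, 1, H, W)
        attn = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * attn


class ResidualSCABlock(nn.Module):
    """Conv block followed by channel and spatial attention, with a skip path."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.sa(self.ca(self.body(x)))
        return x + y  # residual connection keeps gradients and low-level detail

In a generator of this kind, stacking several such blocks would let the network learn, for each pixel of the SOS map, which parts of the time-of-flight input contribute most, which is the role the abstract attributes to the attention module.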

Source journal
Biomedical Engineering Letters (Engineering, Biomedical)
CiteScore: 6.80
Self-citation rate: 0.00%
Articles published: 34
Journal description: Biomedical Engineering Letters (BMEL) aims to present innovative experimental science and technological developments in the biomedical field, as well as the clinical application of new developments. Articles must contain original biomedical engineering content, defined as the development, theoretical analysis, and evaluation/validation of a new technique. BMEL publishes the following types of papers: original articles, review articles, editorials, and letters to the editor. All papers are reviewed in single-blind fashion.
Latest articles from this journal
CT synthesis with deep learning for MR-only radiotherapy planning: a review.
A comprehensive review on Compton camera image reconstruction: from principles to AI innovations.
A review of deep learning-based reconstruction methods for accelerated MRI using spatiotemporal and multi-contrast redundancies.
Strategies for mitigating inter-crystal scattering effects in positron emission tomography: a comprehensive review.
Self-supervised learning for CT image denoising and reconstruction: a review.