Deep learning-based automatic pipeline for 3D needle localization on intra-procedural 3D MRI.

IF 2.3 | CAS Tier 3 (Medicine) | JCR Q3 (Engineering, Biomedical) | International Journal of Computer Assisted Radiology and Surgery | Published: 2024-11-01 (Epub: 2024-03-23) | DOI: 10.1007/s11548-024-03077-3
Wenqi Zhou, Xinzhou Li, Fatemeh Zabihollahy, David S Lu, Holden H Wu
{"title":"Deep learning-based automatic pipeline for 3D needle localization on intra-procedural 3D MRI.","authors":"Wenqi Zhou, Xinzhou Li, Fatemeh Zabihollahy, David S Lu, Holden H Wu","doi":"10.1007/s11548-024-03077-3","DOIUrl":null,"url":null,"abstract":"<p><strong>Purpose: </strong>Accurate and rapid needle localization on 3D magnetic resonance imaging (MRI) is critical for MRI-guided percutaneous interventions. The current workflow requires manual needle localization on 3D MRI, which is time-consuming and cumbersome. Automatic methods using 2D deep learning networks for needle segmentation require manual image plane localization, while 3D networks are challenged by the need for sufficient training datasets. This work aimed to develop an automatic deep learning-based pipeline for accurate and rapid 3D needle localization on in vivo intra-procedural 3D MRI using a limited training dataset.</p><p><strong>Methods: </strong>The proposed automatic pipeline adopted Shifted Window (Swin) Transformers and employed a coarse-to-fine segmentation strategy: (1) initial 3D needle feature segmentation with 3D Swin UNEt TRansfomer (UNETR); (2) generation of a 2D reformatted image containing the needle feature; (3) fine 2D needle feature segmentation with 2D Swin Transformer and calculation of 3D needle tip position and axis orientation. Pre-training and data augmentation were performed to improve network training. The pipeline was evaluated via cross-validation with 49 in vivo intra-procedural 3D MR images from preclinical pig experiments. The needle tip and axis localization errors were compared with human intra-reader variation using the Wilcoxon signed rank test, with p < 0.05 considered significant.</p><p><strong>Results: </strong>The average end-to-end computational time for the pipeline was 6 s per 3D volume. The median Dice scores of the 3D Swin UNETR and 2D Swin Transformer in the pipeline were 0.80 and 0.93, respectively. The median 3D needle tip and axis localization errors were 1.48 mm (1.09 pixels) and 0.98°, respectively. Needle tip localization errors were significantly smaller than human intra-reader variation (median 1.70 mm; p < 0.01).</p><p><strong>Conclusion: </strong>The proposed automatic pipeline achieved rapid pixel-level 3D needle localization on intra-procedural 3D MRI without requiring a large 3D training dataset and has the potential to assist MRI-guided percutaneous interventions.</p>","PeriodicalId":51251,"journal":{"name":"International Journal of Computer Assisted Radiology and Surgery","volume":null,"pages":null},"PeriodicalIF":2.3000,"publicationDate":"2024-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11541278/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Computer Assisted Radiology and Surgery","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1007/s11548-024-03077-3","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/3/23 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 0

Abstract

Purpose: Accurate and rapid needle localization on 3D magnetic resonance imaging (MRI) is critical for MRI-guided percutaneous interventions. The current workflow requires manual needle localization on 3D MRI, which is time-consuming and cumbersome. Automatic methods using 2D deep learning networks for needle segmentation require manual image plane localization, while 3D networks are challenged by the need for sufficient training datasets. This work aimed to develop an automatic deep learning-based pipeline for accurate and rapid 3D needle localization on in vivo intra-procedural 3D MRI using a limited training dataset.

Methods: The proposed automatic pipeline adopted Shifted Window (Swin) Transformers and employed a coarse-to-fine segmentation strategy: (1) initial 3D needle feature segmentation with 3D Swin UNEt TRansformer (UNETR); (2) generation of a 2D reformatted image containing the needle feature; (3) fine 2D needle feature segmentation with 2D Swin Transformer and calculation of 3D needle tip position and axis orientation. Pre-training and data augmentation were performed to improve network training. The pipeline was evaluated via cross-validation with 49 in vivo intra-procedural 3D MR images from preclinical pig experiments. The needle tip and axis localization errors were compared with human intra-reader variation using the Wilcoxon signed rank test, with p < 0.05 considered significant.
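The geometric glue between the three steps can be illustrated with a short, hypothetical sketch. The segmentation networks and the plane-reformatting routine are stand-in callables (coarse_net, fine_net, and reformat_plane are assumptions, not the authors' code); only the axis fitting and tip extraction that any such coarse-to-fine pipeline needs are written out.

```python
# Minimal sketch of a coarse-to-fine needle localization pipeline, assuming
# placeholder networks; not the authors' implementation.
import numpy as np

def fit_needle_axis(mask: np.ndarray, spacing=(1.0, 1.0, 1.0)):
    """Fit a 3D line to the segmented needle voxels via PCA.

    Returns the voxel-cloud centroid (in mm) and a unit direction vector.
    """
    coords = np.argwhere(mask > 0) * np.asarray(spacing)  # voxel indices -> mm
    centroid = coords.mean(axis=0)
    # Principal direction of the voxel cloud approximates the needle axis.
    _, _, vt = np.linalg.svd(coords - centroid, full_matrices=False)
    direction = vt[0] / np.linalg.norm(vt[0])
    return centroid, direction

def estimate_tip(mask: np.ndarray, direction: np.ndarray, spacing=(1.0, 1.0, 1.0)):
    """Take the segmented voxel farthest along the axis as the tip.

    Assumes the direction vector's sign points toward the needle tip.
    """
    coords = np.argwhere(mask > 0) * np.asarray(spacing)
    projections = coords @ direction
    return coords[np.argmax(projections)]

def localize_needle(volume, coarse_net, fine_net, reformat_plane):
    """Coarse 3D segmentation -> 2D reformat along the needle -> fine 2D segmentation."""
    coarse_mask = coarse_net(volume)                    # step 1: 3D segmenter (placeholder)
    point, direction = fit_needle_axis(coarse_mask)
    image_2d, to_3d = reformat_plane(volume, point, direction)  # step 2: reformatted slice
    fine_mask_2d = fine_net(image_2d)                   # step 3: 2D segmenter (placeholder)
    fine_mask_3d = to_3d(fine_mask_2d)                  # map refined mask back to 3D
    point, direction = fit_needle_axis(fine_mask_3d)
    tip = estimate_tip(fine_mask_3d, direction)
    return tip, direction
```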

Results: The average end-to-end computational time for the pipeline was 6 s per 3D volume. The median Dice scores of the 3D Swin UNETR and 2D Swin Transformer in the pipeline were 0.80 and 0.93, respectively. The median 3D needle tip and axis localization errors were 1.48 mm (1.09 pixels) and 0.98°, respectively. Needle tip localization errors were significantly smaller than human intra-reader variation (median 1.70 mm; p < 0.01).
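For context, the reported quantities (Dice score, 3D tip localization error in mm, axis angle error in degrees) can be computed from a predicted and a reference mask, tip, and axis as in the following minimal sketch; the function names and array conventions are assumptions, not the paper's evaluation code.

```python
# Illustrative metric computations for the quantities reported above.
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice overlap between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

def tip_error_mm(pred_tip, true_tip) -> float:
    """Euclidean distance between predicted and reference tip positions (mm)."""
    return float(np.linalg.norm(np.asarray(pred_tip) - np.asarray(true_tip)))

def axis_angle_deg(pred_dir, true_dir) -> float:
    """Unsigned angle (degrees) between predicted and reference needle axes."""
    a = np.asarray(pred_dir) / np.linalg.norm(pred_dir)
    b = np.asarray(true_dir) / np.linalg.norm(true_dir)
    cos = np.clip(abs(np.dot(a, b)), 0.0, 1.0)
    return float(np.degrees(np.arccos(cos)))
```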

Conclusion: The proposed automatic pipeline achieved rapid pixel-level 3D needle localization on intra-procedural 3D MRI without requiring a large 3D training dataset and has the potential to assist MRI-guided percutaneous interventions.

Source Journal
International Journal of Computer Assisted Radiology and Surgery (Engineering, Biomedical; Radiology, Nuclear Medicine & Medical Imaging)
CiteScore: 5.90
Self-citation rate: 6.70%
Articles per year: 243
Review turnaround: 6-12 weeks
Journal description: The International Journal for Computer Assisted Radiology and Surgery (IJCARS) is a peer-reviewed journal that provides a platform for closing the gap between medical and technical disciplines, and encourages interdisciplinary research and development activities in an international environment.