Real-time CBCT Imaging and Motion Tracking via a Single Arbitrarily-angled X-ray Projection by a Joint Dynamic Reconstruction and Motion Estimation (DREME) Framework

Hua-Chieh Shao, Tielige Mengke, Tinsu Pan, You Zhang
{"title":"通过动态重建和运动估计(DREME)联合框架(DREME),利用单个任意角度的 X 射线投影进行实时 CBCT 成像和运动跟踪","authors":"Hua-Chieh Shao, Tielige Mengke, Tinsu Pan, You Zhang","doi":"arxiv-2409.04614","DOIUrl":null,"url":null,"abstract":"Real-time cone-beam computed tomography (CBCT) provides instantaneous\nvisualization of patient anatomy for image guidance, motion tracking, and\nonline treatment adaptation in radiotherapy. While many real-time imaging and\nmotion tracking methods leveraged patient-specific prior information to\nalleviate under-sampling challenges and meet the temporal constraint (< 500\nms), the prior information can be outdated and introduce biases, thus\ncompromising the imaging and motion tracking accuracy. To address this\nchallenge, we developed a framework (DREME) for real-time CBCT imaging and\nmotion estimation, without relying on patient-specific prior knowledge. DREME\nincorporates a deep learning-based real-time CBCT imaging and motion estimation\nmethod into a dynamic CBCT reconstruction framework. The reconstruction\nframework reconstructs a dynamic sequence of CBCTs in a data-driven manner from\na standard pre-treatment scan, without utilizing patient-specific knowledge.\nMeanwhile, a convolutional neural network-based motion encoder is jointly\ntrained during the reconstruction to learn motion-related features relevant for\nreal-time motion estimation, based on a single arbitrarily-angled x-ray\nprojection. DREME was tested on digital phantom simulation and real patient\nstudies. DREME accurately solved 3D respiration-induced anatomic motion in real\ntime (~1.5 ms inference time for each x-ray projection). In the digital phantom\nstudy, it achieved an average lung tumor center-of-mass localization error of\n1.2$\\pm$0.9 mm (Mean$\\pm$SD). In the patient study, it achieved a real-time\ntumor localization accuracy of 1.8$\\pm$1.6 mm in the projection domain. DREME\nachieves CBCT and volumetric motion estimation in real time from a single x-ray\nprojection at arbitrary angles, paving the way for future clinical applications\nin intra-fractional motion management.","PeriodicalId":501378,"journal":{"name":"arXiv - PHYS - Medical Physics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Real-time CBCT Imaging and Motion Tracking via a Single Arbitrarily-angled X-ray Projection by a Joint Dynamic Reconstruction and Motion Estimation (DREME) Framework (DREME) Framework\",\"authors\":\"Hua-Chieh Shao, Tielige Mengke, Tinsu Pan, You Zhang\",\"doi\":\"arxiv-2409.04614\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Real-time cone-beam computed tomography (CBCT) provides instantaneous\\nvisualization of patient anatomy for image guidance, motion tracking, and\\nonline treatment adaptation in radiotherapy. While many real-time imaging and\\nmotion tracking methods leveraged patient-specific prior information to\\nalleviate under-sampling challenges and meet the temporal constraint (< 500\\nms), the prior information can be outdated and introduce biases, thus\\ncompromising the imaging and motion tracking accuracy. To address this\\nchallenge, we developed a framework (DREME) for real-time CBCT imaging and\\nmotion estimation, without relying on patient-specific prior knowledge. DREME\\nincorporates a deep learning-based real-time CBCT imaging and motion estimation\\nmethod into a dynamic CBCT reconstruction framework. 
The reconstruction\\nframework reconstructs a dynamic sequence of CBCTs in a data-driven manner from\\na standard pre-treatment scan, without utilizing patient-specific knowledge.\\nMeanwhile, a convolutional neural network-based motion encoder is jointly\\ntrained during the reconstruction to learn motion-related features relevant for\\nreal-time motion estimation, based on a single arbitrarily-angled x-ray\\nprojection. DREME was tested on digital phantom simulation and real patient\\nstudies. DREME accurately solved 3D respiration-induced anatomic motion in real\\ntime (~1.5 ms inference time for each x-ray projection). In the digital phantom\\nstudy, it achieved an average lung tumor center-of-mass localization error of\\n1.2$\\\\pm$0.9 mm (Mean$\\\\pm$SD). In the patient study, it achieved a real-time\\ntumor localization accuracy of 1.8$\\\\pm$1.6 mm in the projection domain. DREME\\nachieves CBCT and volumetric motion estimation in real time from a single x-ray\\nprojection at arbitrary angles, paving the way for future clinical applications\\nin intra-fractional motion management.\",\"PeriodicalId\":501378,\"journal\":{\"name\":\"arXiv - PHYS - Medical Physics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - PHYS - Medical Physics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.04614\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - PHYS - Medical Physics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.04614","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Real-time cone-beam computed tomography (CBCT) provides instantaneous visualization of patient anatomy for image guidance, motion tracking, and online treatment adaptation in radiotherapy. While many real-time imaging and motion tracking methods leverage patient-specific prior information to alleviate under-sampling challenges and meet the temporal constraint (< 500 ms), the prior information can be outdated and introduce biases, compromising imaging and motion tracking accuracy. To address this challenge, we developed a framework (DREME) for real-time CBCT imaging and motion estimation that does not rely on patient-specific prior knowledge. DREME incorporates a deep learning-based real-time CBCT imaging and motion estimation method into a dynamic CBCT reconstruction framework. The reconstruction framework reconstructs a dynamic sequence of CBCTs in a data-driven manner from a standard pre-treatment scan, without utilizing patient-specific knowledge. Meanwhile, a convolutional neural network-based motion encoder is jointly trained during the reconstruction to learn motion-related features relevant for real-time motion estimation from a single arbitrarily-angled x-ray projection. DREME was tested in a digital phantom simulation and in real patient studies. It accurately solved 3D respiration-induced anatomic motion in real time (~1.5 ms inference time per x-ray projection). In the digital phantom study, it achieved an average lung tumor center-of-mass localization error of 1.2 ± 0.9 mm (mean ± SD). In the patient study, it achieved a real-time tumor localization accuracy of 1.8 ± 1.6 mm in the projection domain. DREME achieves CBCT and volumetric motion estimation in real time from a single x-ray projection at arbitrary angles, paving the way for future clinical applications in intra-fractional motion management.
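To make the real-time inference step concrete, the sketch below shows one way the pipeline described in the abstract could look in PyTorch: a small CNN encoder (`MotionEncoder`) regresses a handful of motion coefficients from a single 2D projection, and those coefficients weight a set of deformation basis fields that warp a reference CBCT volume. Only the overall idea (a CNN motion encoder driving volumetric motion estimation from one arbitrarily-angled projection) comes from the abstract; the architecture, the three-coefficient linear motion model, and names such as `warp_reference` and `basis_dvfs` are illustrative assumptions, not DREME's actual implementation.

```python
# Minimal PyTorch sketch of the real-time inference path described in the
# abstract. The encoder architecture, the number of motion coefficients, and
# the linear (basis-field) motion model are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MotionEncoder(nn.Module):
    """CNN that regresses K motion coefficients from one 2D projection."""

    def __init__(self, num_coeffs: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, num_coeffs)

    def forward(self, projection: torch.Tensor) -> torch.Tensor:
        # projection: (B, 1, H, W) x-ray projection at an arbitrary angle
        return self.head(self.features(projection).flatten(1))


def warp_reference(ref_volume: torch.Tensor,
                   basis_dvfs: torch.Tensor,
                   coeffs: torch.Tensor) -> torch.Tensor:
    """Warp a reference CBCT with a coefficient-weighted sum of basis DVFs.

    ref_volume: (B, 1, D, H, W) reference CBCT
    basis_dvfs: (K, 3, D, H, W) learned deformation basis fields (assumed)
    coeffs:     (B, K) motion coefficients from the encoder
    """
    B, _, D, H, W = ref_volume.shape
    # Instantaneous deformation vector field: (B, 3, D, H, W)
    dvf = torch.einsum("bk,kcdhw->bcdhw", coeffs, basis_dvfs)
    # Identity sampling grid in normalized [-1, 1] coordinates
    grid = F.affine_grid(
        torch.eye(3, 4).unsqueeze(0).repeat(B, 1, 1),
        size=(B, 1, D, H, W), align_corners=False)          # (B, D, H, W, 3)
    # grid_sample expects the displacement components on the last axis
    warped_grid = grid + dvf.permute(0, 2, 3, 4, 1)
    return F.grid_sample(ref_volume, warped_grid, align_corners=False)


if __name__ == "__main__":
    encoder = MotionEncoder(num_coeffs=3)
    projection = torch.rand(1, 1, 256, 256)      # one incoming x-ray projection
    ref_cbct = torch.rand(1, 1, 64, 64, 64)      # reference CBCT volume
    basis = torch.zeros(3, 3, 64, 64, 64)        # placeholder basis DVFs
    with torch.no_grad():
        coeffs = encoder(projection)             # real-time motion estimate
        cbct_t = warp_reference(ref_cbct, basis, coeffs)
    print(coeffs.shape, cbct_t.shape)            # (1, 3), (1, 1, 64, 64, 64)
```

In a scheme like this, the basis fields and encoder would presumably be optimized jointly during the data-driven reconstruction of the pre-treatment scan, so that at treatment time only the encoder forward pass and a single volume warp are executed per incoming projection, which is the kind of lightweight computation consistent with the ~1.5 ms per-projection inference time reported in the abstract.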