Progressive auto-segmentation for cone-beam computed tomography-based online adaptive radiotherapy

Hengrui Zhao, Xiao Liang, Boyu Meng, Michael Dohopolski, Byongsu Choi, Bin Cai, Mu-Han Lin, Ti Bai, Dan Nguyen, Steve Jiang
Cited by: 0

Abstract


Background and purpose

Accurate and automated segmentation of targets and organs-at-risk (OARs) is crucial for the successful clinical application of online adaptive radiotherapy (ART). Current methods for cone-beam computed tomography (CBCT) auto-segmentation face challenges, and the resulting segmentations often fail to reach clinical acceptability. Moreover, these methods overlook the wealth of information available from initial planning and prior adaptive fractions that could enhance segmentation precision.

Materials and methods

We introduce a novel framework that incorporates data from a patient’s initial plan and previous adaptive fractions, harnessing this additional temporal context to significantly refine the segmentation accuracy for the current fraction’s CBCT images. We present LSTM-UNet, an innovative architecture that integrates Long Short-Term Memory (LSTM) units into the skip connections of the traditional U-Net framework to retain information from previous fractions. The models underwent initial pre-training with simulated data followed by fine-tuning on a clinical dataset.
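The core idea of placing memory cells in the skip connections can be sketched minimally. The snippet below is an illustrative toy, not the authors' implementation: it treats a skip-connection feature map as a sequence across fractions (initial plan, then each adaptive fraction) and updates it with a per-pixel LSTM cell, equivalent to LSTM gates built from 1×1 convolutions. The class name `PixelLSTMCell`, the random weights, and all dimensions are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class PixelLSTMCell:
    """Minimal per-pixel LSTM cell (1x1-conv gates) for a skip connection.
    The hidden state h and cell state c share the spatial shape of the
    skip-connection feature map, so h can replace it in the decoder."""
    def __init__(self, channels, seed=0):
        rng = np.random.default_rng(seed)
        # Four gates (input, forget, cell, output); each mixes [x, h] -> C channels.
        self.W = rng.normal(0.0, 0.1, size=(4, 2 * channels, channels))
        self.b = np.zeros((4, channels))

    def step(self, x, h, c):
        # x, h, c: (H, W, C) feature maps for one fraction.
        z = np.concatenate([x, h], axis=-1)            # (H, W, 2C)
        i = sigmoid(z @ self.W[0] + self.b[0])         # input gate
        f = sigmoid(z @ self.W[1] + self.b[1])         # forget gate
        g = np.tanh(z @ self.W[2] + self.b[2])         # candidate cell state
        o = sigmoid(z @ self.W[3] + self.b[3])         # output gate
        c = f * c + i * g
        h = o * np.tanh(c)
        return h, c

# Skip features from three successive fractions (e.g. plan + two adaptives).
H, W, C = 8, 8, 4
cell = PixelLSTMCell(C)
h = np.zeros((H, W, C))
c = np.zeros((H, W, C))
for frac in range(3):
    skip = np.random.default_rng(frac).normal(size=(H, W, C))
    h, c = cell.step(skip, h, c)
# `h` now carries information from all prior fractions and would be fed
# to the decoder in place of the plain skip tensor.
```

In a real LSTM-UNet the gates would be learned convolutions and one such cell would sit at each resolution level of the U-Net's skip connections.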

Results

Our proposed model’s segmentation predictions yield an average Dice similarity coefficient of 79% across eight head-and-neck organs and targets, compared to 52% for a baseline model without prior knowledge and 78% for a baseline model with prior knowledge but no memory.
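For reference, the Dice similarity coefficient reported above measures the overlap between a predicted mask P and a ground-truth mask G as DSC = 2|P ∩ G| / (|P| + |G|). A minimal sketch (the masks here are made-up examples):

```python
import numpy as np

def dice(pred, gt):
    """Dice similarity coefficient between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    # Convention: two empty masks agree perfectly.
    return 2.0 * inter / denom if denom else 1.0

a = np.zeros((4, 4), dtype=int); a[1:3, 1:3] = 1   # 4 foreground voxels
b = np.zeros((4, 4), dtype=int); b[1:3, 1:4] = 1   # 6 voxels, 4 overlapping
print(dice(a, b))  # 2*4 / (4+6) = 0.8
```

A DSC of 1.0 means perfect overlap and 0.0 means none, so the 79% vs. 52% gap above corresponds to a substantially closer match to the clinician-drawn contours.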

Conclusions

Our proposed model outperforms baseline segmentation frameworks by effectively utilizing information from prior fractions, thus reducing the effort clinicians spend revising auto-segmentation results. Moreover, it complements registration-based methods that offer better prior knowledge. Our model holds promise for integration into the online ART workflow, offering precise segmentation capabilities on synthetic CT images.

Source journal: Physics and Imaging in Radiation Oncology (Physics and Astronomy – Radiation)
CiteScore: 5.30
Self-citation rate: 18.90%
Articles per year: 93
Review time: 6 weeks
Latest articles in this journal:
- Results of 2023 survey on the use of synthetic computed tomography for magnetic resonance Imaging-only radiotherapy: Current status and future steps
- Head and neck automatic multi-organ segmentation on Dual-Energy Computed Tomography
- Automatic segmentation for magnetic resonance imaging guided individual elective lymph node irradiation in head and neck cancer patients
- Development of a novel 3D-printed dynamic anthropomorphic thorax phantom for evaluation of four-dimensional computed tomography
- Technical feasibility of delivering a simultaneous integrated boost in partial breast irradiation