Improving Deformable Image Registration Accuracy through a Hybrid Similarity Metric and CycleGAN Based Auto-Segmentation.

ArXiv Pub Date: 2024-11-25
Keyur D Shah, James A Shackleford, Nagarajan Kandasamy, Gregory C Sharp
{"title":"基于相似度度量和CycleGAN的混合自动分割提高可变形图像配准精度。","authors":"Keyur D Shah, James A Shackleford, Nagarajan Kandasamy, Gregory C Sharp","doi":"","DOIUrl":null,"url":null,"abstract":"<p><strong>Purpose: </strong>Deformable image registration (DIR) plays a critical role in adaptive radiation therapy (ART) to accommodate anatomical changes. However, conventional intensity-based DIR methods face challenges when registering images with unequal image intensities. In these cases, DIR accuracy can be improved using a hybrid image similarity metric which matches both image intensities and the location of known structures. This study aims to assess DIR accuracy using a hybrid similarity metric and leveraging CycleGAN-based intensity correction and auto-segmentation and comparing performance across three DIR workflows.</p><p><strong>Methods: </strong>The proposed approach incorporates a hybrid image similarity metric combining a point-to-distance (PD) score and intensity similarity score. Synthetic CT (sCT) images were generated using a 2D CycleGAN model trained on unpaired CT and CBCT images, improving soft-tissue contrast in CBCT images. The performance of the approach was evaluated by comparing three DIR workflows: (1) traditional intensity-based (No PD), (2) auto-segmented contours on sCT (CycleGAN PD), and (3) expert manual contours (Expert PD). A 3D U-Net model was then trained on two datasets comprising 56 3D images and validated on 14 independent cases to segment the prostate, bladder, and rectum. DIR accuracy was assessed using Dice Similarity Coefficient (DSC), 95% Hausdorff Distance (HD), and fiducial separation metrics.</p><p><strong>Results: </strong>The hybrid similarity metric significantly improved DIR accuracy. For the prostate, DSC increased from 0.61 ± 0.18 (No PD) to 0.82 ± 0.13 (CycleGAN PD) and 0.89 ± 0.05 (Expert PD), with corresponding reductions in 95% HD from 11.75 mm to 4.86 mm and 3.27 mm, respectively. Fiducial separation was also reduced from 8.95 mm to 4.07 mm (CycleGAN PD) and 4.11 mm (Expert PD) (p < 0.05). Improvements in alignment were also observed for the bladder and rectum, highlighting the method's robustness.</p><p><strong>Conclusion: </strong>A hybrid similarity metric that uses CycleGAN-based auto-segmentation presents a promising avenue for advancing DIR accuracy in ART. The study's findings suggest the potential for substantial enhancements in DIR accuracy by combining AI-based image correction and auto-segmentation with classical DIR.</p>","PeriodicalId":93888,"journal":{"name":"ArXiv","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-11-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11623701/pdf/","citationCount":"0","resultStr":"{\"title\":\"Improving Deformable Image Registration Accuracy through a Hybrid Similarity Metric and CycleGAN Based Auto-Segmentation.\",\"authors\":\"Keyur D Shah, James A Shackleford, Nagarajan Kandasamy, Gregory C Sharp\",\"doi\":\"\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Purpose: </strong>Deformable image registration (DIR) plays a critical role in adaptive radiation therapy (ART) to accommodate anatomical changes. However, conventional intensity-based DIR methods face challenges when registering images with unequal image intensities. In these cases, DIR accuracy can be improved using a hybrid image similarity metric which matches both image intensities and the location of known structures. 
This study aims to assess DIR accuracy using a hybrid similarity metric and leveraging CycleGAN-based intensity correction and auto-segmentation and comparing performance across three DIR workflows.</p><p><strong>Methods: </strong>The proposed approach incorporates a hybrid image similarity metric combining a point-to-distance (PD) score and intensity similarity score. Synthetic CT (sCT) images were generated using a 2D CycleGAN model trained on unpaired CT and CBCT images, improving soft-tissue contrast in CBCT images. The performance of the approach was evaluated by comparing three DIR workflows: (1) traditional intensity-based (No PD), (2) auto-segmented contours on sCT (CycleGAN PD), and (3) expert manual contours (Expert PD). A 3D U-Net model was then trained on two datasets comprising 56 3D images and validated on 14 independent cases to segment the prostate, bladder, and rectum. DIR accuracy was assessed using Dice Similarity Coefficient (DSC), 95% Hausdorff Distance (HD), and fiducial separation metrics.</p><p><strong>Results: </strong>The hybrid similarity metric significantly improved DIR accuracy. For the prostate, DSC increased from 0.61 ± 0.18 (No PD) to 0.82 ± 0.13 (CycleGAN PD) and 0.89 ± 0.05 (Expert PD), with corresponding reductions in 95% HD from 11.75 mm to 4.86 mm and 3.27 mm, respectively. Fiducial separation was also reduced from 8.95 mm to 4.07 mm (CycleGAN PD) and 4.11 mm (Expert PD) (p < 0.05). Improvements in alignment were also observed for the bladder and rectum, highlighting the method's robustness.</p><p><strong>Conclusion: </strong>A hybrid similarity metric that uses CycleGAN-based auto-segmentation presents a promising avenue for advancing DIR accuracy in ART. The study's findings suggest the potential for substantial enhancements in DIR accuracy by combining AI-based image correction and auto-segmentation with classical DIR.</p>\",\"PeriodicalId\":93888,\"journal\":{\"name\":\"ArXiv\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-11-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11623701/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ArXiv\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ArXiv","FirstCategoryId":"1085","ListUrlMain":"","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract


Purpose: Deformable image registration (DIR) plays a critical role in adaptive radiation therapy (ART) to accommodate anatomical changes. However, conventional intensity-based DIR methods face challenges when registering images with unequal image intensities. In these cases, DIR accuracy can be improved with a hybrid image similarity metric that matches both image intensities and the locations of known structures. This study assesses DIR accuracy using such a hybrid similarity metric, leveraging CycleGAN-based intensity correction and auto-segmentation, and compares performance across three DIR workflows.
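
To make the idea of a hybrid metric concrete, the sketch below shows one way an intensity term and a structure-location term could be combined: a mean-squared-error intensity score plus a weighted point-to-distance (PD) penalty read from a distance map of a known structure. This is a minimal illustration; the MSE choice, the weight `lam`, and all names are assumptions, not the authors' implementation.

```python
# Minimal sketch of a hybrid similarity cost (lower is better), assuming
# a fixed image, a warped moving image, a binary mask of a known structure
# in the fixed image, and the warped positions of that structure's contour
# points. The MSE intensity term, the weight `lam`, and all names are
# illustrative assumptions, not the paper's implementation.
import numpy as np
from scipy.ndimage import distance_transform_edt


def point_to_distance_score(fixed_mask: np.ndarray,
                            warped_points: np.ndarray) -> float:
    """Mean distance (in voxels) from warped contour points to the
    corresponding structure in the fixed image."""
    # Distance from each voxel to the nearest voxel inside the structure.
    dist_map = distance_transform_edt(np.logical_not(fixed_mask))
    idx = np.clip(np.round(warped_points).astype(int), 0,
                  np.array(fixed_mask.shape) - 1)
    return float(dist_map[idx[:, 0], idx[:, 1], idx[:, 2]].mean())


def hybrid_cost(fixed: np.ndarray, warped_moving: np.ndarray,
                fixed_mask: np.ndarray, warped_points: np.ndarray,
                lam: float = 0.1) -> float:
    """Intensity term (MSE) plus a weighted point-to-distance (PD) term."""
    intensity_term = float(np.mean((fixed - warped_moving) ** 2))
    pd_term = point_to_distance_score(fixed_mask, warped_points)
    return intensity_term + lam * pd_term
```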

Methods: The proposed approach incorporates a hybrid image similarity metric combining a point-to-distance (PD) score with an intensity similarity score. Synthetic CT (sCT) images were generated using a 2D CycleGAN model trained on unpaired CT and CBCT images, improving soft-tissue contrast in the CBCT images. The approach was evaluated by comparing three DIR workflows: (1) traditional intensity-based registration (No PD), (2) auto-segmented contours on sCT (CycleGAN PD), and (3) expert manual contours (Expert PD). A 3D U-Net model was then trained on two datasets comprising 56 3D images and validated on 14 independent cases to segment the prostate, bladder, and rectum. DIR accuracy was assessed using the Dice Similarity Coefficient (DSC), the 95% Hausdorff Distance (HD), and fiducial separation.
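
As a reference for the evaluation metrics named above, the sketch below computes the DSC and one common definition of the 95% HD (the 95th percentile of symmetric surface-to-surface distances) between two binary masks. Surface extraction and spacing handling are simplified; this is illustrative rather than the study's exact evaluation code.

```python
# Hedged sketch of the evaluation metrics: Dice Similarity Coefficient
# (DSC) and a 95th-percentile Hausdorff Distance (95% HD) between two
# binary 3D masks. `spacing` is the voxel size in mm along each axis.
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt


def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return float(2.0 * np.logical_and(a, b).sum() / denom) if denom else 1.0


def hd95(a: np.ndarray, b: np.ndarray,
         spacing=(1.0, 1.0, 1.0)) -> float:
    a, b = a.astype(bool), b.astype(bool)
    # Surface voxels: the mask minus its erosion.
    surf_a = a & ~binary_erosion(a)
    surf_b = b & ~binary_erosion(b)
    # Distance (in mm) from every voxel to the nearest surface voxel
    # of the other mask.
    dist_to_b = distance_transform_edt(~surf_b, sampling=spacing)
    dist_to_a = distance_transform_edt(~surf_a, sampling=spacing)
    d_ab = dist_to_b[surf_a]  # A-surface to B-surface distances
    d_ba = dist_to_a[surf_b]  # B-surface to A-surface distances
    return float(np.percentile(np.hstack([d_ab, d_ba]), 95))
```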

Results: The hybrid similarity metric significantly improved DIR accuracy. For the prostate, DSC increased from 0.61 ± 0.18 (No PD) to 0.82 ± 0.13 (CycleGAN PD) and 0.89 ± 0.05 (Expert PD), with corresponding reductions in 95% HD from 11.75 mm to 4.86 mm and 3.27 mm, respectively. Fiducial separation was also reduced from 8.95 mm to 4.07 mm (CycleGAN PD) and 4.11 mm (Expert PD) (p < 0.05). Improvements in alignment were also observed for the bladder and rectum, highlighting the method's robustness.
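
The fiducial separation reported above is commonly defined as the residual 3D Euclidean distance between corresponding fiducial markers after registration. A minimal sketch, assuming two (N, 3) coordinate arrays in millimetres with hypothetical names:

```python
# Hedged sketch: residual fiducial separation after registration.
# `warped_fiducials` and `reference_fiducials` are assumed to be (N, 3)
# arrays of corresponding marker positions in mm; names are illustrative.
import numpy as np


def fiducial_separation(warped_fiducials: np.ndarray,
                        reference_fiducials: np.ndarray) -> float:
    """Mean 3D Euclidean distance between corresponding fiducials."""
    d = np.linalg.norm(warped_fiducials - reference_fiducials, axis=1)
    return float(d.mean())
```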

Conclusion: A hybrid similarity metric that uses CycleGAN-based auto-segmentation presents a promising avenue for advancing DIR accuracy in ART. The study's findings suggest the potential for substantial enhancements in DIR accuracy by combining AI-based image correction and auto-segmentation with classical DIR.
