Fast and Accurate U-Net Model for Fetal Ultrasound Image Segmentation.

Impact Factor: 2.5 | CAS Tier 4 (Medicine) | JCR Q1 (ACOUSTICS) | Ultrasonic Imaging | Pub Date: 2022-01-01 | Epub Date: 2022-01-06 | DOI: 10.1177/01617346211069882
Vahid Ashkani Chenarlogh, Mostafa Ghelich Oghli, Ali Shabanzadeh, Nasim Sirjani, Ardavan Akhavan, Isaac Shiri, Hossein Arabi, Morteza Sanei Taheri, Mohammad Kazem Tarzamni
{"title":"Fast and Accurate U-Net Model for Fetal Ultrasound Image Segmentation.","authors":"Vahid Ashkani Chenarlogh,&nbsp;Mostafa Ghelich Oghli,&nbsp;Ali Shabanzadeh,&nbsp;Nasim Sirjani,&nbsp;Ardavan Akhavan,&nbsp;Isaac Shiri,&nbsp;Hossein Arabi,&nbsp;Morteza Sanei Taheri,&nbsp;Mohammad Kazem Tarzamni","doi":"10.1177/01617346211069882","DOIUrl":null,"url":null,"abstract":"<p><p>U-Net based algorithms, due to their complex computations, include limitations when they are used in clinical devices. In this paper, we addressed this problem through a novel U-Net based architecture that called fast and accurate U-Net for medical image segmentation task. The proposed fast and accurate U-Net model contains four tuned 2D-convolutional, 2D-transposed convolutional, and batch normalization layers as its main layers. There are four blocks in the encoder-decoder path. The results of our proposed architecture were evaluated using a prepared dataset for head circumference and abdominal circumference segmentation tasks, and a public dataset (HC18-Grand challenge dataset) for fetal head circumference measurement. The proposed fast network significantly improved the processing time in comparison with U-Net, dilated U-Net, R2U-Net, attention U-Net, and MFP U-Net. It took 0.47 seconds for segmenting a fetal abdominal image. In addition, over the prepared dataset using the proposed accurate model, Dice and Jaccard coefficients were 97.62% and 95.43% for fetal head segmentation, 95.07%, and 91.99% for fetal abdominal segmentation. Moreover, we have obtained the Dice and Jaccard coefficients of 97.45% and 95.00% using the public HC18-Grand challenge dataset. Based on the obtained results, we have concluded that a fine-tuned and a simple well-structured model used in clinical devices can outperform complex models.</p>","PeriodicalId":49401,"journal":{"name":"Ultrasonic Imaging","volume":null,"pages":null},"PeriodicalIF":2.5000,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Ultrasonic Imaging","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1177/01617346211069882","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2022/1/6 0:00:00","PubModel":"Epub","JCR":"Q1","JCRName":"ACOUSTICS","Score":null,"Total":0}
Citations: 9

Abstract

U-Net based algorithms, due to their complex computations, face limitations when used in clinical devices. In this paper, we addressed this problem through a novel U-Net based architecture, called fast and accurate U-Net, for medical image segmentation tasks. The proposed fast and accurate U-Net model contains four tuned 2D-convolutional, 2D-transposed convolutional, and batch normalization layers as its main layers, with four blocks in the encoder-decoder path. The proposed architecture was evaluated using a prepared dataset for head circumference and abdominal circumference segmentation tasks, and a public dataset (HC18-Grand Challenge dataset) for fetal head circumference measurement. The proposed fast network significantly reduced the processing time in comparison with U-Net, dilated U-Net, R2U-Net, attention U-Net, and MFP U-Net, taking 0.47 seconds to segment a fetal abdominal image. In addition, over the prepared dataset, the proposed accurate model achieved Dice and Jaccard coefficients of 97.62% and 95.43% for fetal head segmentation and 95.07% and 91.99% for fetal abdominal segmentation. Moreover, we obtained Dice and Jaccard coefficients of 97.45% and 95.00% on the public HC18-Grand Challenge dataset. Based on these results, we conclude that a fine-tuned, simple, and well-structured model used in clinical devices can outperform complex models.
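The abstract specifies four encoder-decoder blocks built from 2D convolutions, 2D transposed convolutions, and batch normalization, but not the exact channel widths or per-block layout. The following is a minimal PyTorch sketch of such a four-block encoder-decoder, assuming 3x3 convolutions, 2x2 max pooling, U-Net-style skip connections, and illustrative channel widths; the names `FastUNetSketch` and `conv_bn_relu` and all hyperparameters are hypothetical, not the authors' exact configuration.

```python
# Minimal sketch of a four-block encoder-decoder in the spirit of the paper's
# "fast and accurate U-Net": 2D convolutions, 2D transposed convolutions, and
# batch normalization as the main layers. Widths and block layout are assumptions.
import torch
import torch.nn as nn


def conv_bn_relu(in_ch, out_ch):
    """One encoder/decoder stage: 3x3 conv -> batch norm -> ReLU (assumed layout)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class FastUNetSketch(nn.Module):
    def __init__(self, in_ch=1, out_ch=1, widths=(16, 32, 64, 128)):
        super().__init__()
        # Four encoder blocks, each followed by 2x2 max pooling.
        self.encoders = nn.ModuleList()
        prev = in_ch
        for w in widths:
            self.encoders.append(conv_bn_relu(prev, w))
            prev = w
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_bn_relu(widths[-1], widths[-1] * 2)
        # Four decoder blocks: transposed-conv upsampling, then conv after skip concat.
        self.upconvs = nn.ModuleList()
        self.decoders = nn.ModuleList()
        prev = widths[-1] * 2
        for w in reversed(widths):
            self.upconvs.append(nn.ConvTranspose2d(prev, w, kernel_size=2, stride=2))
            self.decoders.append(conv_bn_relu(w * 2, w))  # w from upsample + w from skip
            prev = w
        self.head = nn.Conv2d(widths[0], out_ch, kernel_size=1)

    def forward(self, x):
        skips = []
        for enc in self.encoders:
            x = enc(x)
            skips.append(x)
            x = self.pool(x)
        x = self.bottleneck(x)
        for up, dec, skip in zip(self.upconvs, self.decoders, reversed(skips)):
            x = up(x)
            x = dec(torch.cat([x, skip], dim=1))
        return torch.sigmoid(self.head(x))  # binary fetal-structure mask


if __name__ == "__main__":
    model = FastUNetSketch()
    mask = model(torch.randn(1, 1, 256, 256))  # input size must be divisible by 16
    print(mask.shape)  # torch.Size([1, 1, 256, 256])
```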
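The reported Dice and Jaccard coefficients are the standard overlap metrics between a predicted mask P and a ground-truth mask G: Dice = 2|P ∩ G| / (|P| + |G|) and Jaccard = |P ∩ G| / |P ∪ G|. A minimal NumPy sketch of their computation follows; the helper `dice_jaccard` is illustrative and not taken from the paper.

```python
# Overlap metrics for binary segmentation masks, as reported in the abstract.
import numpy as np


def dice_jaccard(pred, target, eps=1e-7):
    """Return (Dice, Jaccard) for two binary masks of the same shape."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    dice = 2.0 * intersection / (pred.sum() + target.sum() + eps)
    jaccard = intersection / (union + eps)
    return dice, jaccard
```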

Source journal: Ultrasonic Imaging (Medicine - Engineering: Biomedical)
CiteScore: 5.10
Self-citation rate: 8.70%
Articles published: 15
Review time: >12 weeks
Journal introduction: Ultrasonic Imaging provides rapid publication for original and exceptional papers concerned with the development and application of ultrasonic-imaging technology. Ultrasonic Imaging publishes articles in the following areas: theoretical and experimental aspects of advanced methods and instrumentation for imaging
Latest articles in this journal
Development of a Polymer Ultrasound Contrast Agent Incorporating Nested Carbon Nanodots.
Automated Deep Learning-Based Finger Joint Segmentation in 3-D Ultrasound Images With Limited Dataset.
CBAM-RIUnet: Breast Tumor Segmentation With Enhanced Breast Ultrasound and Test-Time Augmentation.
Deep Learning Radiomics Based on Two-Dimensional Ultrasound for Predicting the Efficacy of Neoadjuvant Chemotherapy in Breast Cancer.
SPGAN Optimized by Piranha Foraging Optimization for Thyroid Nodule Classification in Ultrasound Images.