Automated segmentation and labeling of subcutaneous mouse implants at 14.1T

Frontiers in Signal Processing · IF 1.3 (Q3, Engineering, Electrical & Electronic) · Pub Date: 2023-08-21 · DOI: 10.3389/frsip.2023.1155618
Julien Adda, G. Bioley, D. Van de Ville, C. Cudalbu, M. G. Preti, N. Gninenko
Citations: 0

Abstract

Magnetic resonance imaging (MRI) is a valuable tool for studying subcutaneous implants in rodents, providing non-invasive insight into biomaterial conformability and enabling longitudinal characterization. However, considerable variability across existing image analysis techniques, the reliance on manual segmentation and labeling, and the lack of reference atlases (unlike in brain imaging) all render manual implant segmentation tedious and extremely time-consuming. To this end, the development of automated and robust segmentation pipelines is a necessary addition to the tools available in rodent imaging research. In this work, we presented and compared commonly used contrast-based image processing segmentation approaches—namely, Canny edge detection, Otsu's single- and multi-threshold methods, and a combination of the latter with morphological operators—with more recently introduced convolutional neural network (CNN)-based models, such as the U-Net and nnU-Net ("no-new-net"). These fully automated, end-to-end, state-of-the-art neural architectures have shown great promise in online segmentation challenges. We adapted them to the implant segmentation task in mouse MRI, with both 2D and 3D implementations. Our results demonstrated the superiority of the 3D nnU-Net model, which robustly segments the implants with an average Dice accuracy of 0.915 and an acceptable absolute volume prediction error of 5.74%. Additionally, we provide researchers in the field with an automated segmentation pipeline in Python, leveraging these CNN-based implementations, which reduces the manual labeling time drastically, from approximately 90 min to less than 5 min (292.959 s ± 6.49 s, N = 30 predictions). The latter addresses the bottleneck of constrained animal experimental time in pre-clinical rodent research.
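Two of the building blocks named in the abstract—Otsu's single-threshold method (one of the classical baselines) and the Dice coefficient (the reported accuracy metric)—can be illustrated with a minimal sketch. The helpers below are hypothetical, not taken from the paper's pipeline; they implement the standard between-class-variance criterion and the Dice overlap on flat binary masks, in pure Python for clarity.

```python
def otsu_threshold(hist):
    """Return the intensity t maximizing between-class variance.

    `hist` is a list where hist[i] counts pixels of intensity i;
    pixels with intensity <= t form the background class.
    """
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0.0  # running weight and intensity sum of the background class
    for t in range(len(hist)):
        w0 += hist[t]
        sum0 += t * hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        mu0 = sum0 / w0               # background mean
        mu1 = (sum_all - sum0) / w1   # foreground mean
        var = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance (up to a constant)
        if var > best_var:
            best_var, best_t = var, t
    return best_t


def dice(pred, truth):
    """Dice coefficient 2|A∩B| / (|A|+|B|) between two binary masks."""
    inter = sum(1 for p, g in zip(pred, truth) if p and g)
    return 2.0 * inter / (sum(pred) + sum(truth))


# Toy bimodal histogram: dark background peak and bright implant peak.
hist = [10, 8, 2, 0, 0, 2, 8, 10]
print(otsu_threshold(hist))          # -> 2 (splits the two modes)
print(dice([0, 1, 1, 1], [0, 1, 1, 0]))  # -> 0.8
```

In practice one would use `skimage.filters.threshold_otsu` (or `threshold_multiotsu` for the multi-threshold variant mentioned in the abstract) on the MR image intensities, but the decision rule is exactly the one sketched above.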