Mask R-CNN assisted diagnosis of spinal tuberculosis.

Journal of X-Ray Science and Technology | IF 1.4 | CAS Tier 3 (Medicine) | JCR Q3 (Instruments & Instrumentation) | Pub Date: 2025-01-01 | Epub Date: 2024-12-24 | DOI: 10.1177/08953996241290326
Wenjun Li, Yanfan Li, Huan Peng, Wenjun Liang
{"title":"Mask R-CNN assisted diagnosis of spinal tuberculosis.","authors":"Wenjun Li, Yanfan Li, Huan Peng, Wenjun Liang","doi":"10.1177/08953996241290326","DOIUrl":null,"url":null,"abstract":"<p><p>The prevalence of spinal tuberculosis (ST) is particularly high in underdeveloped regions with inadequate medical conditions. This not only leads to misdiagnosis and delays in treatment progress but also contributes to the continued transmission of tuberculosis bacteria, posing a risk to other individuals. Currently, CT imaging is extensively utilized in computer-aided diagnosis (CAD). The main features of ST on CT images include bone destruction, osteosclerosis, sequestration formation, and intervertebral disc damage. However, manual diagnosis by doctors may result in subjective judgments and misdiagnosis. Therefore, an accurate and objective method is needed for diagnosing of spinal tuberculosis. In this paper, we put forward an assistive diagnostic approach for spinal tuberculosis that is based on deep learning. The approach uses the Mask R-CNN model. Moreover, we modify the original model network by incorporating the ResPath and cbam* to improve the performance metrics, namely <math><mi>m</mi><mi>A</mi><msub><mi>P</mi><mrow><mrow><mi>small</mi></mrow></mrow></msub></math> and <i>F1-score</i>. Meanwhile, other deep learning models such as Faster-RCNN and SSD were also compared. Experimental results demonstrate that the enhanced model can effectively identify spinal tuberculosis lesions, with an <math><mi>m</mi><mi>A</mi><msub><mi>P</mi><mrow><mrow><mi>small</mi></mrow></mrow></msub></math> of 0.9175, surpassing the original model's 0.8340, and an <i>F1-score</i> of 0.9335, outperforming the original model's 0.8657.</p>","PeriodicalId":49948,"journal":{"name":"Journal of X-Ray Science and Technology","volume":" ","pages":"120-133"},"PeriodicalIF":1.4000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of X-Ray Science and Technology","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1177/08953996241290326","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/12/24 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"INSTRUMENTS & INSTRUMENTATION","Score":null,"Total":0}
Citations: 0

Abstract

The prevalence of spinal tuberculosis (ST) is particularly high in underdeveloped regions with inadequate medical resources. This not only leads to misdiagnosis and delayed treatment but also contributes to the continued transmission of tuberculosis bacteria, posing a risk to other individuals. CT imaging is currently used extensively in computer-aided diagnosis (CAD). The main features of ST on CT images include bone destruction, osteosclerosis, sequestrum formation, and intervertebral disc damage. However, manual diagnosis by doctors can be subjective and prone to misdiagnosis. Therefore, an accurate and objective method is needed for the diagnosis of spinal tuberculosis. In this paper, we propose a deep learning-based assistive diagnostic approach for spinal tuberculosis built on the Mask R-CNN model. We modify the original network by incorporating ResPath and CBAM* modules to improve the performance metrics, namely mAP_small and F1-score. We also compare the approach against other deep learning models such as Faster R-CNN and SSD. Experimental results demonstrate that the enhanced model can effectively identify spinal tuberculosis lesions, achieving an mAP_small of 0.9175 versus the original model's 0.8340 and an F1-score of 0.9335 versus the original model's 0.8657.
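The paper's modified network is not reproduced here, but as a rough orientation the sketch below shows how a CBAM-style attention block (channel attention followed by spatial attention) can be defined in PyTorch and exercised next to a stock torchvision Mask R-CNN. This is an illustrative assumption only: the module names, the reduction ratio, the insertion point, and the two-class setup (background vs. tuberculosis lesion) are hypothetical and do not come from the article; ResPath is not shown.

```python
# Minimal, illustrative sketch: a CBAM-style attention block alongside a stock
# torchvision Mask R-CNN. NOT the authors' implementation; names, reduction
# ratio, insertion point, and the 2-class setup are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision


class ChannelAttention(nn.Module):
    """Re-weight channels using avg- and max-pooled descriptors through a shared MLP."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate = torch.sigmoid(
            self.mlp(F.adaptive_avg_pool2d(x, 1)) + self.mlp(F.adaptive_max_pool2d(x, 1))
        )
        return gate * x


class SpatialAttention(nn.Module):
    """Learn a per-pixel attention map from channel-wise average and max pooling."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        pooled = torch.cat(
            [x.mean(dim=1, keepdim=True), x.max(dim=1, keepdim=True).values], dim=1
        )
        return torch.sigmoid(self.conv(pooled)) * x


class CBAM(nn.Module):
    """Convolutional Block Attention Module: channel attention then spatial attention."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.channel = ChannelAttention(channels, reduction)
        self.spatial = SpatialAttention()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.spatial(self.channel(x))


if __name__ == "__main__":
    # Baseline detector; 2 classes = background + tuberculosis lesion (assumed).
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights=None, num_classes=2)
    model.eval()
    ct_slice = [torch.rand(3, 512, 512)]        # one pseudo CT slice (random data)
    with torch.no_grad():
        detections = model(ct_slice)            # dicts with boxes, labels, scores, masks
    cbam = CBAM(channels=256)
    fpn_level = torch.rand(1, 256, 64, 64)      # e.g. one FPN feature map
    print(cbam(fpn_level).shape, detections[0]["boxes"].shape)
```

In the published CBAM design, the channel gate passes both the average- and max-pooled descriptors through a shared MLP before a sigmoid, which is what this sketch reproduces; where such a block is attached inside the detector (backbone, FPN, or heads) is a design choice the paper does not constrain here.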

Journal metrics
CiteScore: 4.90
Self-citation rate: 23.30%
Articles published: 150
Review time: 3 months
Journal introduction: Research areas within the scope of the journal include:
Interaction of x-rays with matter: x-ray phenomena, biological effects of radiation, radiation safety, and optical constants
X-ray sources: x-rays from synchrotrons, x-ray lasers, plasmas, and other sources, conventional or unconventional
Optical elements: grazing incidence optics, multilayer mirrors, zone plates, gratings, other diffraction optics
Optical instruments: interferometers, spectrometers, microscopes, telescopes, microprobes