Cervix Type and Cervical Cancer Classification System Using Deep Learning Techniques.

IF 1.3 | Q4 | ENGINEERING, BIOMEDICAL | Medical Devices-Evidence and Research | Pub Date: 2022-06-16 | eCollection Date: 2022-01-01 | DOI: 10.2147/MDER.S366303
Lidiya Wubshet Habtemariam, Elbetel Taye Zewde, Gizeaddis Lamesgin Simegn
{"title":"使用深度学习技术的子宫颈类型和宫颈癌分类系统。","authors":"Lidiya Wubshet Habtemariam,&nbsp;Elbetel Taye Zewde,&nbsp;Gizeaddis Lamesgin Simegn","doi":"10.2147/MDER.S366303","DOIUrl":null,"url":null,"abstract":"<p><strong>Purpose: </strong>Cervical cancer is the 4th most common cancer among women, worldwide. Incidence and mortality rates are consistently increasing, especially in developing countries, due to the shortage of screening facilities, limited skilled professionals, and lack of awareness. Cervical cancer is screened using visual inspection after application of acetic acid (VIA), papanicolaou (Pap) test, human papillomavirus (HPV) test and histopathology test. Inter- and intra-observer variability may occur during the manual diagnosis procedure, resulting in misdiagnosis. The purpose of this study was to develop an integrated and robust system for automatic cervix type and cervical cancer classification using deep learning techniques.</p><p><strong>Methods: </strong>4005 colposcopy images and 915 histopathology images were collected from different local health facilities and online public datasets. Different pre-trained models were trained and compared for cervix type classification. Prior to classification, the region of interest (ROI) was extracted from cervix images by training and validating a lightweight MobileNetv2-YOLOv3 model to detect the transformation region. The extracted cervix images were then fed to the EffecientNetb0 model for cervix type classification. For cervical cancer classification, an EffecientNetB0 pre-trained model was trained and validated using histogram matched histopathological images.</p><p><strong>Results: </strong>Mean average precision (mAP) of 99.88% for the region of interest (ROI) extraction, and test accuracies of 96.84% and 94.5% were achieved for the cervix type and cervical cancer classification, respectively.</p><p><strong>Conclusion: </strong>The experimental results demonstrate that the proposed system can be used as a decision support tool in the diagnosis of cervical cancer, especially in low resources settings, where the expertise and the means are limited.</p>","PeriodicalId":47140,"journal":{"name":"Medical Devices-Evidence and Research","volume":" ","pages":"163-176"},"PeriodicalIF":1.3000,"publicationDate":"2022-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ftp.ncbi.nlm.nih.gov/pub/pmc/oa_pdf/e3/52/mder-15-163.PMC9208738.pdf","citationCount":"7","resultStr":"{\"title\":\"Cervix Type and Cervical Cancer Classification System Using Deep Learning Techniques.\",\"authors\":\"Lidiya Wubshet Habtemariam,&nbsp;Elbetel Taye Zewde,&nbsp;Gizeaddis Lamesgin Simegn\",\"doi\":\"10.2147/MDER.S366303\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Purpose: </strong>Cervical cancer is the 4th most common cancer among women, worldwide. Incidence and mortality rates are consistently increasing, especially in developing countries, due to the shortage of screening facilities, limited skilled professionals, and lack of awareness. Cervical cancer is screened using visual inspection after application of acetic acid (VIA), papanicolaou (Pap) test, human papillomavirus (HPV) test and histopathology test. Inter- and intra-observer variability may occur during the manual diagnosis procedure, resulting in misdiagnosis. 
The purpose of this study was to develop an integrated and robust system for automatic cervix type and cervical cancer classification using deep learning techniques.</p><p><strong>Methods: </strong>4005 colposcopy images and 915 histopathology images were collected from different local health facilities and online public datasets. Different pre-trained models were trained and compared for cervix type classification. Prior to classification, the region of interest (ROI) was extracted from cervix images by training and validating a lightweight MobileNetv2-YOLOv3 model to detect the transformation region. The extracted cervix images were then fed to the EffecientNetb0 model for cervix type classification. For cervical cancer classification, an EffecientNetB0 pre-trained model was trained and validated using histogram matched histopathological images.</p><p><strong>Results: </strong>Mean average precision (mAP) of 99.88% for the region of interest (ROI) extraction, and test accuracies of 96.84% and 94.5% were achieved for the cervix type and cervical cancer classification, respectively.</p><p><strong>Conclusion: </strong>The experimental results demonstrate that the proposed system can be used as a decision support tool in the diagnosis of cervical cancer, especially in low resources settings, where the expertise and the means are limited.</p>\",\"PeriodicalId\":47140,\"journal\":{\"name\":\"Medical Devices-Evidence and Research\",\"volume\":\" \",\"pages\":\"163-176\"},\"PeriodicalIF\":1.3000,\"publicationDate\":\"2022-06-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ftp.ncbi.nlm.nih.gov/pub/pmc/oa_pdf/e3/52/mder-15-163.PMC9208738.pdf\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Medical Devices-Evidence and Research\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2147/MDER.S366303\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2022/1/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"Q4\",\"JCRName\":\"ENGINEERING, BIOMEDICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Medical Devices-Evidence and Research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2147/MDER.S366303","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2022/1/1 0:00:00","PubModel":"eCollection","JCR":"Q4","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 7

Abstract


Purpose: Cervical cancer is the fourth most common cancer among women worldwide. Incidence and mortality rates are consistently increasing, especially in developing countries, due to the shortage of screening facilities, limited skilled professionals, and lack of awareness. Cervical cancer is screened using visual inspection with acetic acid (VIA), the Papanicolaou (Pap) test, the human papillomavirus (HPV) test, and histopathology. Inter- and intra-observer variability may occur during the manual diagnosis procedure, resulting in misdiagnosis. The purpose of this study was to develop an integrated and robust system for automatic cervix type and cervical cancer classification using deep learning techniques.

Methods: A total of 4005 colposcopy images and 915 histopathology images were collected from different local health facilities and online public datasets. Different pre-trained models were trained and compared for cervix type classification. Prior to classification, the region of interest (ROI) was extracted from the cervix images by training and validating a lightweight MobileNetV2-YOLOv3 model to detect the transformation region. The extracted cervix images were then fed to an EfficientNetB0 model for cervix type classification. For cervical cancer classification, an EfficientNetB0 pre-trained model was trained and validated using histogram-matched histopathological images.
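
The abstract does not include implementation details, but the two-stage workflow it describes (detector-based ROI cropping, EfficientNetB0 classification, and histogram matching of the histopathology images) can be sketched as below. This is a minimal illustration assuming TensorFlow/Keras and scikit-image; the input size, class count, and helper names are placeholders for illustration, not the authors' code.

```python
# Minimal sketch of the described pipeline (assumed TensorFlow/Keras + scikit-image,
# not the authors' implementation). Steps: (1) crop the detected transformation-region
# ROI from a colposcopy image, (2) classify cervix type with EfficientNetB0,
# (3) histogram-match a histopathology image to a reference before classification.
import numpy as np
import tensorflow as tf
from skimage.exposure import match_histograms

IMG_SIZE = 224          # assumed EfficientNetB0 input resolution
NUM_CERVIX_TYPES = 3    # assumed number of cervix type classes

def crop_roi(image: np.ndarray, box: tuple) -> np.ndarray:
    """Crop the transformation-region ROI returned by the detector.
    `box` is (x_min, y_min, x_max, y_max) in pixel coordinates."""
    x1, y1, x2, y2 = box
    return image[y1:y2, x1:x2]

def build_cervix_classifier() -> tf.keras.Model:
    """EfficientNetB0 backbone with a new classification head (transfer learning)."""
    base = tf.keras.applications.EfficientNetB0(
        include_top=False, weights="imagenet",
        input_shape=(IMG_SIZE, IMG_SIZE, 3), pooling="avg")
    outputs = tf.keras.layers.Dense(NUM_CERVIX_TYPES, activation="softmax")(base.output)
    return tf.keras.Model(base.input, outputs)

def classify_cervix(model: tf.keras.Model, roi: np.ndarray) -> int:
    """Resize the cropped ROI, apply EfficientNet preprocessing, and predict a class index."""
    x = tf.image.resize(roi, (IMG_SIZE, IMG_SIZE)).numpy()
    x = tf.keras.applications.efficientnet.preprocess_input(x)
    probs = model.predict(x[np.newaxis, ...], verbose=0)
    return int(np.argmax(probs, axis=-1)[0])

def match_histopathology(image: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Histogram-match a histopathology image to a reference slide (per channel),
    mirroring the preprocessing described for the cervical cancer branch."""
    return match_histograms(image, reference, channel_axis=-1)
```

In this sketch, the bounding box passed to crop_roi would come from the trained MobileNetV2-YOLOv3 detector, and the cropped ROI is resized to the classifier's input resolution before prediction.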

Results: A mean average precision (mAP) of 99.88% was achieved for region of interest (ROI) extraction, and test accuracies of 96.84% and 94.5% were achieved for cervix type and cervical cancer classification, respectively.
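
For context, the mAP reported for ROI extraction is the standard object-detection metric computed from IoU-matched predictions and a precision-recall curve. The sketch below is a generic single-class average-precision computation for illustration only; it is not the authors' evaluation code, and the IoU threshold and 11-point interpolation scheme are assumptions.

```python
# Generic single-class detection AP (IoU matching + 11-point precision-recall
# interpolation); illustrative only, not the authors' evaluation code.
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def average_precision(preds, gts, iou_thr=0.5):
    """preds: list of (image_id, score, box); gts: dict image_id -> list of boxes."""
    preds = sorted(preds, key=lambda p: -p[1])                 # rank by confidence
    matched = {k: [False] * len(v) for k, v in gts.items()}    # one flag per ground truth
    tp, fp = [], []
    for img_id, _, box in preds:
        ious = [iou(box, g) for g in gts.get(img_id, [])]
        best = int(np.argmax(ious)) if ious else -1
        if best >= 0 and ious[best] >= iou_thr and not matched[img_id][best]:
            matched[img_id][best] = True
            tp.append(1); fp.append(0)
        else:
            tp.append(0); fp.append(1)
    tp, fp = np.cumsum(tp), np.cumsum(fp)
    n_gt = sum(len(v) for v in gts.values())
    recall = tp / max(n_gt, 1)
    precision = tp / np.maximum(tp + fp, 1e-9)
    # 11-point interpolation of the precision-recall curve
    return float(np.mean([precision[recall >= t].max() if np.any(recall >= t) else 0.0
                          for t in np.linspace(0, 1, 11)]))
```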

Conclusion: The experimental results demonstrate that the proposed system can be used as a decision support tool in the diagnosis of cervical cancer, especially in low-resource settings where expertise and means are limited.

Source journal
Medical Devices-Evidence and Research (ENGINEERING, BIOMEDICAL)
CiteScore: 2.80
Self-citation rate: 0.00%
Articles per year: 41
Review time: 16 weeks