Early detection of gynecological malignancies using ensemble deep learning models: ResNet50 and inception V3

Chetna Vaid Kwatra , Harpreet Kaur , Monika Mangla , Arun Singh , Swapnali N. Tambe , Saiprasad Potharaju
{"title":"Early detection of gynecological malignancies using ensemble deep learning models: ResNet50 and inception V3","authors":"Chetna Vaid Kwatra ,&nbsp;Harpreet Kaur ,&nbsp;Monika Mangla ,&nbsp;Arun Singh ,&nbsp;Swapnali N. Tambe ,&nbsp;Saiprasad Potharaju","doi":"10.1016/j.imu.2025.101620","DOIUrl":null,"url":null,"abstract":"<div><h3>Background and objective</h3><div>Improving patient outcomes and lowering death rates depend on the early identification of gynecological cancers. This work intends to improve the accuracy and dependability of early gynecological tumor diagnosis by means of a hybrid deep learning model combining ResNet50 and Inception v3 architectures.</div></div><div><h3>Methods</h3><div>The proposed ensemble model combines multi-scale feature extraction of Inception v3 with the deep residual learning capability of ResNet50. A significant number of gynecological images were employed for training, testing, and assessment of the proposed model. By entailing accuracy, sensitivity, specificity, and F1 score, among other parameters the performance of the model was assessed.</div></div><div><h3>Results</h3><div>The first experiment depicted displays that the ensemble model performed better than single models with a training accuracy of 99.80 %, a validation accuracy of 99.80 %, and a test accuracy of 99.80 %. Comparing the two studies done in the current research, the model has shown to have a high sensitivity of 99 %, specificity of 99 %, and F1 score of 0.99, making it better in the identification of gynecological cancers and significantly reducing low true negatives and low true positives.</div></div><div><h3>Conclusions</h3><div>Ensembling of ResNet50 with Inception v3 for early diagnosis of gynecological cancers is promising and reproducible. 
Thus, according to the presented results, this method can contribute to the diagnoses of diseases by doctors quickly and accurately and, therefore, improve the treatment outcomes and the patient's health</div></div>","PeriodicalId":13953,"journal":{"name":"Informatics in Medicine Unlocked","volume":"53 ","pages":"Article 101620"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Informatics in Medicine Unlocked","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2352914825000085","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Medicine","Score":null,"Total":0}
引用次数: 0

Abstract

Background and objective

Early identification of gynecological cancers is critical for improving patient outcomes and lowering mortality rates. This work aims to improve the accuracy and reliability of early gynecological tumor diagnosis by means of a hybrid deep learning model combining the ResNet50 and Inception v3 architectures.

Methods

The proposed ensemble model combines the multi-scale feature extraction of Inception v3 with the deep residual learning capability of ResNet50. A large set of gynecological images was employed for training, testing, and evaluation of the proposed model. Model performance was assessed using accuracy, sensitivity, specificity, and F1 score, among other metrics.
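The abstract does not specify how the two networks' outputs are fused. A common approach for this kind of ensemble is to average the class-probability outputs of the two models; the sketch below illustrates that idea with fixed stand-in probability arrays in place of actual ResNet50 and Inception v3 predictions (the weight `w` and the stub values are illustrative assumptions, not taken from the paper).

```python
import numpy as np

def ensemble_predict(prob_a, prob_b, w=0.5):
    """Weighted average of two models' class-probability outputs.

    prob_a, prob_b: arrays of shape (batch, n_classes), each row summing to 1.
    w: weight given to the first model (0.5 = simple averaging).
    """
    return w * prob_a + (1.0 - w) * prob_b

# Stand-in softmax outputs for a 2-image, 2-class batch; in practice these
# would come from the trained ResNet50 and Inception v3 networks.
p_resnet = np.array([[0.7, 0.3],
                     [0.2, 0.8]])
p_inception = np.array([[0.6, 0.4],
                        [0.4, 0.6]])

p_ensemble = ensemble_predict(p_resnet, p_inception)
# → [[0.65, 0.35], [0.30, 0.70]]
predictions = p_ensemble.argmax(axis=1)
# → [0, 1]
```

Simple probability averaging often smooths out the individual models' errors: an image misclassified with low confidence by one network can be recovered when the other network is confidently correct.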

Results

The first experiment shows that the ensemble model outperformed the individual models, with a training accuracy of 99.80 %, a validation accuracy of 99.80 %, and a test accuracy of 99.80 %. Across the two experiments conducted in this research, the model achieved a sensitivity of 99 %, a specificity of 99 %, and an F1 score of 0.99, improving the identification of gynecological cancers while substantially reducing false negatives and false positives.
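The reported sensitivity, specificity, and F1 score all derive from the confusion-matrix counts in the standard way. As a sketch, the counts below (tp = 99, fn = 1, tn = 99, fp = 1) are hypothetical round numbers chosen to reproduce the ~99 % figures, not the paper's actual confusion matrix:

```python
def binary_metrics(tp, fp, tn, fn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)          # recall / true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, sensitivity, specificity, f1

# Hypothetical counts yielding the ~99 % figures reported in the abstract.
acc, sens, spec, f1 = binary_metrics(tp=99, fp=1, tn=99, fn=1)
# → acc = 0.99, sens = 0.99, spec = 0.99, f1 = 0.99
```

High sensitivity and specificity together mean both error types are rare, which is why the F1 score, the harmonic mean of precision and recall, also lands at 0.99.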

Conclusions

Ensembling ResNet50 with Inception v3 for early diagnosis of gynecological cancers is promising and reproducible. According to the presented results, this method can help doctors diagnose these diseases quickly and accurately, thereby improving treatment outcomes and patient health.


Source journal: Informatics in Medicine Unlocked (Medicine - Health Informatics)
CiteScore: 9.50
Self-citation rate: 0.00%
Annual publications: 282
Review time: 39 days
Journal description: Informatics in Medicine Unlocked (IMU) is an international gold open access journal covering a broad spectrum of topics within medical informatics, including (but not limited to) papers focusing on imaging, pathology, teledermatology, public health, ophthalmological, nursing and translational medicine informatics. The full papers that are published in the journal are accessible to all who visit the website.