{"title":"Robust assessment of cervical precancerous lesions from pre- and post-acetic acid cervicography by combining deep learning and medical guidelines","authors":"Siti Nurmaini , Patiyus Agustiyansyah , Muhammad Naufal Rachmatullah , Firdaus Firdaus , Annisa Darmawahyuni , Bambang Tutuko , Ade Iriani Sapitri , Anggun Islami , Akhiar Wista Arum , Rizal Sanif , Irawan Sastradinata , Legiran Legiran , Radiyati Umi Partan","doi":"10.1016/j.imu.2024.101609","DOIUrl":null,"url":null,"abstract":"<div><div>Cervical cancer remains a major public health challenge, particularly in low-resource settings where access to regular screening and expert medical evaluation is limited. Traditional visual inspection with acetic acid (VIA) has been widely used for cervical cancer screening but is subjective and highly dependent on the expertise of the healthcare provider. This study presents a comprehensive methodology for decision-making regarding cervical precancerous lesions using cervicograms taken before and after the application of acetic acid. By leveraging the power of the deep learning (DL) model with You Only Look Once (Yolo) version 8, Slicing Aided Hyper Inference (SAHI), and oncology medical guidelines, the system aims to improve the accuracy and consistency of VIA assessments. The method involves training a Yolov8xl model on our cervicogram dataset, annotated by two oncologists using VIA screening results, to distinguish between the cervical area, columnar area, and lesions. The model is designed to process cervicography images taken both before and after the application of acetic acid, capturing the dynamic changes in tissue appearance indicative of precancerous conditions. The automated evaluation system demonstrated high sensitivity and specificity in detecting cervical lesions with 90.78 % accuracy, 91.67 % sensitivity, and 90.96 % specificity, outperforming other existing methods. This work represents a significant step towards deploying AI-driven solutions in cervical cancer screening, potentially reducing the global burden of the disease. It can be integrated into existing screening programs, providing a valuable tool for early detection and intervention, especially in regions with limited access to trained medical personnel.</div></div>","PeriodicalId":13953,"journal":{"name":"Informatics in Medicine Unlocked","volume":"52 ","pages":"Article 101609"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Informatics in Medicine Unlocked","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2352914824001667","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Medicine","Score":null,"Total":0}
Citations: 0
Abstract
Cervical cancer remains a major public health challenge, particularly in low-resource settings where access to regular screening and expert medical evaluation is limited. Traditional visual inspection with acetic acid (VIA) has been widely used for cervical cancer screening, but it is subjective and highly dependent on the expertise of the healthcare provider. This study presents a comprehensive methodology for decision-making on cervical precancerous lesions using cervicograms taken before and after the application of acetic acid. By combining a deep learning (DL) model based on You Only Look Once (YOLO) version 8 with Slicing Aided Hyper Inference (SAHI) and oncology medical guidelines, the system aims to improve the accuracy and consistency of VIA assessments. The method involves training a YOLOv8xl model on our cervicogram dataset, annotated by two oncologists using VIA screening results, to distinguish between the cervical area, the columnar area, and lesions. The model processes cervicography images taken both before and after the application of acetic acid, capturing the dynamic changes in tissue appearance indicative of precancerous conditions. The automated evaluation system detected cervical lesions with 90.78 % accuracy, 91.67 % sensitivity, and 90.96 % specificity, outperforming other existing methods. This work represents a significant step towards deploying AI-driven solutions in cervical cancer screening, potentially reducing the global burden of the disease. It can be integrated into existing screening programs, providing a valuable tool for early detection and intervention, especially in regions with limited access to trained medical personnel.
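To make the described pipeline concrete, the sketch below shows how a trained YOLOv8 cervicogram detector could be wrapped with SAHI's sliced inference and combined with a simple guideline-style comparison of a pre- and post-acetic-acid image pair. This is a minimal illustration, not the authors' implementation: the class names (`cervix`, `columnar`, `lesion`), the weight file path, the slice sizes, and the VIA decision rule at the end are all assumptions.

```python
# Hedged sketch: YOLOv8 + SAHI sliced inference on a pre/post-acetic-acid image pair.
# Class names, weight path, slice sizes, and the final VIA rule are illustrative
# assumptions only; they do not reproduce the authors' exact configuration.
from sahi import AutoDetectionModel
from sahi.predict import get_sliced_prediction

# Load a (hypothetical) trained YOLOv8 cervicogram detector through SAHI.
detection_model = AutoDetectionModel.from_pretrained(
    model_type="yolov8",
    model_path="cervicogram_yolov8x.pt",  # assumed weight file
    confidence_threshold=0.25,
    device="cuda:0",                      # or "cpu"
)

def detect(image_path: str) -> dict:
    """Run sliced inference on one cervicogram and group detections by class name."""
    result = get_sliced_prediction(
        image_path,
        detection_model,
        slice_height=512,
        slice_width=512,
        overlap_height_ratio=0.2,
        overlap_width_ratio=0.2,
    )
    grouped: dict = {}
    for pred in result.object_prediction_list:
        grouped.setdefault(pred.category.name, []).append(
            (pred.bbox.to_voc_bbox(), pred.score.value)  # [xmin, ymin, xmax, ymax], confidence
        )
    return grouped

pre = detect("cervix_pre_acetic.jpg")    # image taken before acetic acid application
post = detect("cervix_post_acetic.jpg")  # image taken after acetic acid application

# Illustrative guideline-style rule (assumption): flag the case as VIA-positive when a
# lesion is detected after acetic acid application but not before, mimicking the
# acetowhite change that clinicians look for during visual inspection.
via_positive = bool(post.get("lesion")) and not pre.get("lesion")
print("VIA assessment:", "positive" if via_positive else "negative")
```

In practice, the paper's decision logic draws on oncology medical guidelines and on the spatial relationship between the detected cervical area, columnar area, and lesions; the single pre/post comparison above is only a stand-in for that richer rule set.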
Journal description:
Informatics in Medicine Unlocked (IMU) is an international gold open access journal covering a broad spectrum of topics within medical informatics, including (but not limited to) papers focusing on imaging, pathology, teledermatology, public health, ophthalmological, nursing and translational medicine informatics. The full papers that are published in the journal are accessible to all who visit the website.